Title: Survey on Deep Learning with Imbalanced Data Sets (深度學習在不平衡數據集之研究)
Author: Tsai, Cheng-Hsiao (蔡承孝)
Advisor: 蔡炎龍
Keywords: Deep Learning; Convolutional Neural Network (CNN); Imbalanced Data Sets; Anomaly Detection; Image Classification
Date: 2019
Uploaded: 3-Oct-2019 17:17:29 (UTC+8)

Abstract
This thesis surveys deep learning methods for imbalanced data sets and anomaly detection. From MNIST we create two highly imbalanced data sets, each with an imbalance ratio of ρ = 2500: one for a multi-class classification task, in which classes 0, 1, 4, 6, and 7 are the minority classes, and one for a binary classification task, in which class 0 is the minority class. We train our models with convolutional neural networks (CNNs). For anomaly detection, we use the pretrained CNN handwriting classifier to judge whether each of 18 cat and dog pictures is a handwriting picture.
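To make the setup concrete, here is a minimal sketch of how such imbalanced subsets could be drawn from MNIST. The per-class counts (5,000 per majority class, hence 2 per minority class) are assumptions chosen only so that the majority-to-minority ratio equals ρ = 2500; the thesis fixes the exact construction in its Chapter 4 and appendix code, not here.

```python
# Hypothetical construction of the two imbalanced MNIST subsets.
# Only the imbalance ratio rho = 2500 comes from the abstract;
# n_major = 5000 (and thus n_major // rho = 2) is an assumption.
import numpy as np
from tensorflow.keras.datasets import mnist

(x_train, y_train), _ = mnist.load_data()

def make_imbalanced(x, y, minority_classes, n_major=5000, rho=2500):
    """Keep n_major samples of each majority class and
    n_major // rho samples of each minority class."""
    n_minor = max(n_major // rho, 1)  # 5000 // 2500 = 2 per minority class
    xs, ys = [], []
    for c in range(10):
        idx = np.where(y == c)[0]
        idx = idx[: n_minor if c in minority_classes else n_major]
        xs.append(x[idx])
        ys.append(y[idx])
    return np.concatenate(xs), np.concatenate(ys)

# Multi-class task: classes 0, 1, 4, 6 and 7 are the minority classes.
x_multi, y_multi = make_imbalanced(x_train, y_train, {0, 1, 4, 6, 7})

# Binary task: class 0 is the minority class (labels recoded to 0-vs-rest).
x_bin, y_raw = make_imbalanced(x_train, y_train, {0})
y_bin = (y_raw == 0).astype(int)
```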
Because the data sets are highly imbalanced, the baseline models perform poorly on the minority classes. We therefore adjust our models with six different methods on the multi-class task and seven on the binary task. We find that the focal loss function performs best on the multi-class task and random over-sampling (ROS) performs best on the binary task, while cost-sensitive learning is not suitable for the imbalanced data sets we generate. Using confidence estimation, our classifier correctly judges that none of the cat and dog pictures is a handwriting picture.
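The focal loss that wins the multi-class comparison is the one of Lin et al. [26]. As an illustration, the following is a sketch of a multi-class focal loss for a Keras model with softmax output; the constants α = 0.25 and γ = 2 are the common defaults from [26], not necessarily the values tuned in the thesis.

```python
# Sketch of the focal loss FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t)
# from Lin et al. [26], for one-hot labels and softmax predictions.
# alpha and gamma are assumed defaults, not the thesis's tuned values.
import tensorflow as tf

def focal_loss(alpha=0.25, gamma=2.0):
    def loss(y_true, y_pred):
        y_pred = tf.clip_by_value(y_pred, 1e-7, 1.0 - 1e-7)
        p_t = tf.reduce_sum(y_true * y_pred, axis=-1)  # prob. of the true class
        return -alpha * tf.pow(1.0 - p_t, gamma) * tf.math.log(p_t)
    return loss

# Usage with a compiled CNN (hypothetical model object):
# model.compile(optimizer="adam", loss=focal_loss(), metrics=["accuracy"])
```

Relative to plain cross-entropy, the (1 − p_t)^γ factor down-weights examples the model already classifies confidently, so the rare minority classes contribute more to the gradient; this is one plausible reason it copes with the ρ = 2500 imbalance.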
References
[1] Dzmitry Bahdanau, Kyunghyun Cho, and Yoshua Bengio. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
[2] Mateusz Buda, Atsuto Maki, and Maciej A Mazurowski. A systematic study of the class imbalance problem in convolutional neural networks. Neural Networks, 106:249–259, 2018.
[3] MHB Carvalho, ML Brizot, LM Lopes, CH Chiba, S Miyadahira, and M Zugaib. Detection of fetal structural abnormalities at the 11–14 week ultrasound scan. Prenatal Diagnosis: Published in Affiliation With the International Society for Prenatal Diagnosis, 22(1):1–4, 2002.
[4] Varun Chandola, Arindam Banerjee, and Vipin Kumar. Anomaly detection: A survey. ACM Computing Surveys (CSUR), 41(3):15, 2009.
[5] Nitesh V Chawla, Kevin W Bowyer, Lawrence O Hall, and W Philip Kegelmeyer. SMOTE: Synthetic minority over-sampling technique. Journal of Artificial Intelligence Research, 16:321–357, 2002.
[6] Edward Choi, Andy Schuetz, Walter F Stewart, and Jimeng Sun. Using recurrent neural network models for early detection of heart failure onset. Journal of the American Medical Informatics Association, 24(2):361–370, 2016.
[7] David A Cieslak, Nitesh V Chawla, and Aaron Striegel. Combating imbalance in network intrusion datasets. In GrC, pages 732–737, 2006.
[8] Ronan Collobert and Jason Weston. A unified architecture for natural language processing: Deep neural networks with multitask learning. In Proceedings of the 25th International Conference on Machine Learning, pages 160–167. ACM, 2008.
[9] MJ Desforges, PJ Jacob, and JE Cooper. Applications of probability density estimation to the detection of abnormal conditions in engineering. Proceedings of the Institution of Mechanical Engineers, Part C: Journal of Mechanical Engineering Science, 212(8):687–703, 1998.
[10] Chris Drummond, Robert C Holte, et al. C4.5, class imbalance, and cost sensitivity: Why under-sampling beats over-sampling. In Workshop on Learning from Imbalanced Datasets II, volume 11, pages 1–8. Citeseer, 2003.
[11] Charles Elkan. The foundations of cost-sensitive learning. In International Joint Conference on Artificial Intelligence, volume 17, pages 973–978. Lawrence Erlbaum Associates Ltd, 2001.
[12] Guo Haixiang, Li Yijing, Jennifer Shang, Gu Mingyun, Huang Yuanyue, and Gong Bing. Learning from class-imbalanced data: Review of methods and applications. Expert Systems with Applications, 73:220–239, 2017.
[13] Haibo He and Edwardo A Garcia. Learning from imbalanced data. IEEE Transactions on Knowledge & Data Engineering, (9):1263–1284, 2008.
[14] Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 770–778, 2016.
[15] JB Heaton, Nicholas G Polson, and Jan Hendrik Witte. Deep learning in finance. arXiv preprint arXiv:1602.06561, 2016.
[16] David Hsu, Gildardo Sánchez-Ante, and Zheng Sun. Hybrid PRM sampling with a cost sensitive adaptive strategy. In Proceedings of the 2005 IEEE International Conference on Robotics and Automation, pages 3874–3880. IEEE, 2005.
[17] Anil K Jain, Jianchang Mao, and KM Mohiuddin. Artificial neural networks: A tutorial. Computer, (3):31–44, 1996.
[18] Justin M Johnson and Taghi M Khoshgoftaar. Survey on deep learning with class imbalance. Journal of Big Data, 6(1):27, 2019.
[19] Andrej Karpathy, George Toderici, Sanketh Shetty, Thomas Leung, Rahul Sukthankar, and Li Fei-Fei. Large-scale video classification with convolutional neural networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pages 1725–1732, 2014.
[20] Alex Krizhevsky, Ilya Sutskever, and Geoffrey E Hinton. ImageNet classification with deep convolutional neural networks. In Advances in Neural Information Processing Systems, pages 1097–1105, 2012.
[21] Miroslav Kubat, Robert C Holte, and Stan Matwin. Machine learning for the detection of oil spills in satellite radar images. Machine Learning, 30(2–3):195–215, 1998.
[22] Matjaz Kukar, Igor Kononenko, et al. Cost-sensitive learning with neural networks. In ECAI, pages 445–449, 1998.
[23] Yoji Kukita, Junji Uchida, Shigeyuki Oba, Kazumi Nishino, Toru Kumagai, Kazuya Taniguchi, Takako Okuyama, Fumio Imamura, and Kikuya Kato. Quantitative identification of mutant alleles derived from lung cancer in plasma cell-free DNA via anomaly detection using deep sequencing data. PLoS ONE, 8(11):e81468, 2013.
[24] Yann LeCun, Yoshua Bengio, and Geoffrey Hinton. Deep learning. Nature, 521(7553):436, 2015.
[25] Hansang Lee, Minseok Park, and Junmo Kim. Plankton classification on imbalanced large scale database via convolutional neural networks with transfer learning. In 2016 IEEE International Conference on Image Processing (ICIP), pages 3713–3717. IEEE, 2016.
[26] Tsung-Yi Lin, Priya Goyal, Ross Girshick, Kaiming He, and Piotr Dollár. Focal loss for dense object detection. In Proceedings of the IEEE International Conference on Computer Vision, pages 2980–2988, 2017.
[27] CX Ling and VS Sheng. Cost-sensitive learning and the class imbalance problem. Encyclopedia of Machine Learning, 24. Springer, 2011.
[28] Amogh Mahapatra, Nisheeth Srivastava, and Jaideep Srivastava. Contextual anomaly detection in text data. Algorithms, 5(4):469–489, 2012.
[29] Bomin Mao, Zubair Md Fadlullah, Fengxiao Tang, Nei Kato, Osamu Akashi, Takeru Inoue, and Kimihiro Mizutani. Routing or computing? The paradigm shift towards intelligent computer network packet transmission based on deep learning. IEEE Transactions on Computers, 66(11):1946–1960, 2017.
[30] David Masko and Paulina Hensman. The impact of imbalanced training data for convolutional neural networks, 2015.
[31] P Rahmawati and Prawito Prajitno. Online vibration monitoring of a water pump machine to detect its malfunction components based on artificial neural network. In Journal of Physics: Conference Series, volume 1011, page 012045. IOP Publishing, 2018.
[32] R Bharat Rao, Sriram Krishnan, and Radu Stefan Niculescu. Data mining for improved cardiac care. ACM SIGKDD Explorations Newsletter, 8(1):3–10, 2006.
[33] Richard G Stafford, Jacob Beutel, et al. Application of neural networks as an aid in medical diagnosis and general anomaly detection, July 19 1994. US Patent 5,331,550.
[34] David WJ Stein, Scott G Beaven, Lawrence E Hoff, Edwin M Winter, Alan P Schaum, and Alan D Stocker. Anomaly detection from hyperspectral imagery. IEEE Signal Processing Magazine, 19(1):58–69, 2002.
[35] Daniel Svozil, Vladimir Kvasnicka, and Jiri Pospichal. Introduction to multi-layer feed-forward neural networks. Chemometrics and Intelligent Laboratory Systems, 39(1):43–62, 1997.
[36] Shoujin Wang, Wei Liu, Jia Wu, Longbing Cao, Qinxue Meng, and Paul J Kennedy. Training deep neural networks on imbalanced data sets. In 2016 International Joint Conference on Neural Networks (IJCNN), pages 4368–4374. IEEE, 2016.
[37] Wei Wei, Jinjiu Li, Longbing Cao, Yuming Ou, and Jiahang Chen. Effective detection of sophisticated online banking fraud on extremely imbalanced data. World Wide Web, 16(4):449–475, 2013.
[38] Rui Yan, Yiping Song, and Hua Wu. Learning to respond with deep neural networks for retrieval-based human-computer conversation system. In Proceedings of the 39th International ACM SIGIR Conference on Research and Development in Information Retrieval, pages 55–64. ACM, 2016.
[39] Ke Zhang, Jianwu Xu, Martin Renqiang Min, Guofei Jiang, Konstantinos Pelechrinis, and Hui Zhang. Automated IT system failure prediction: A deep learning approach. In 2016 IEEE International Conference on Big Data (Big Data), pages 1291–1300. IEEE, 2016.
[40] Zhi-Hua Zhou and Xu-Ying Liu. Training cost-sensitive neural networks with methods addressing the class imbalance problem. IEEE Transactions on Knowledge & Data Engineering, (1):63–77, 2006.

Description: Master's thesis
National Chengchi University
Department of Applied Mathematics
Student ID: 105751009
Source: http://thesis.lib.nccu.edu.tw/record/#G0105751009
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/126578
Other Identifier: G0105751009
DOI: 10.6814/NCCU201901175
Type: thesis
Format: application/pdf, 3,698,886 bytes

Table of Contents
1. Introduction ... 1
2. Deep Learning ... 3
2.1 Neurons and Neural Networks ... 4
2.2 Activation Function ... 7
2.3 Loss Function ... 9
2.4 Gradient Descent Method ... 10
3. Convolutional Neural Network (CNN) ... 11
3.1 Convolutional Layer ... 12
3.2 Max Pooling Layer ... 12
4. Abnormal Condition and Imbalanced Data Set ... 14
4.1 Abnormal Condition ... 14
4.2 Imbalanced Data Set ... 15
5. Anomaly Detection ... 17
5.1 Confidence Estimation ... 17
5.2 Gaussian Distribution ... 18
5.3 Model for Confidence Estimation ... 20
6. Methods for Imbalanced Data Problem ... 23
6.1 Data-level Methods ... 23
6.1.1 Random Over-sampling (ROS) ... 23
6.1.2 Synthetic Minority Over-sampling Technique (SMOTE) ... 24
6.1.3 Random Under-sampling (RUS) ... 25
6.2 Algorithm-level Methods ... 26
6.2.1 Mean False Error (MFE) ... 26
6.2.2 Mean Squared False Error (MSFE) ... 27
6.2.3 Focal Loss ... 28
6.2.4 Cost Sensitive Learning ... 30
7. Experiment for Multi-classification Task ... 32
7.1 Baseline Model ... 33
7.2 Random Over-sampling Model ... 35
7.3 Synthetic Minority Over-sampling Technique Model ... 36
7.4 Random Under-sampling Model ... 37
7.5 Mean False Error Model ... 38
7.6 Focal Loss Model ... 39
7.7 Cost Sensitive Learning Model ... 42
7.8 Result for Multi-classification Task ... 43
8. Experiment for Binary Classification Task ... 45
8.1 Baseline Model ... 45
8.2 Random Over-sampling Model ... 46
8.3 Synthetic Minority Over-sampling Technique Model ... 47
8.4 Random Under-sampling Model ... 48
8.5 Mean False Error Model ... 48
8.6 Mean Squared False Error Model ... 49
8.7 Focal Loss Model ... 50
8.8 Cost Sensitive Learning Model ... 52
8.9 Result for Binary Classification Task ... 53
9. Conclusion ... 55
9.1 Contribution ... 55
9.2 Future Work ... 55
Appendix A Python Code ... 56
A.1 Baseline Model ... 56
A.2 Random Over-sampling Model ... 68
A.3 Synthetic Minority Over-sampling Technique Model ... 81
A.4 Random Under-sampling Model ... 100
A.5 Mean False Error Model ... 113
A.6 Focal Loss Model ... 125
A.7 Cost Sensitive Learning Model ... 138
A.8 Mean Squared False Error Model ... 145
A.9 Anomaly Detection Model ... 154
Bibliography ... 164
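Chapter 5 pairs confidence estimation with a Gaussian model of the classifier's confidence. As a rough, hypothetical illustration of that idea (the thesis's own estimator is in Section 5.3 and Appendix A.9): fit a Gaussian to the maximum softmax confidences on handwriting images, then flag an input as not a handwriting picture when its confidence falls far below that distribution. The mean-minus-3σ rule below is an assumption, not the thesis's exact threshold.

```python
# Hypothetical confidence-estimation check: model in-distribution softmax
# confidences as a Gaussian and flag low-confidence inputs as anomalies.
# The 3-sigma threshold is an assumption, not the thesis's exact rule.
import numpy as np

def fit_confidence_gaussian(model, x_in_dist):
    conf = model.predict(x_in_dist).max(axis=-1)  # max softmax per image
    return conf.mean(), conf.std()

def is_anomaly(model, x, mu, sigma, k=3.0):
    conf = model.predict(x).max(axis=-1)
    return conf < mu - k * sigma  # True -> judged not a handwriting picture

# mu, sigma = fit_confidence_gaussian(cnn, x_mnist_test)   # pretrained CNN
# flags = is_anomaly(cnn, cat_dog_images, mu, sigma)       # expect all True
```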