Title: 應用標籤鑲嵌樹架構於解決多元分類問題
Label Embedding Tree for Multi-class Classification
Author: 林威均 (Lin, Wei-Chun)
Advisors: 周珮婷 (Chou, Pei-Ting); 黃佳慧 (Huang, Chia-Hui)
Keywords: Machine learning; Multi-class classification; Multi-class to binary classification
Date: 2021
Uploaded: 1-Feb-2021 13:59:51 (UTC+8)
Abstract: In supervised machine learning, multi-class classification refers to a classification task with more than two classes, in which each sample is assigned to exactly one class. Commonly used multi-class methods typically either assume a particular population distribution for the data or require complicated, time-consuming hyperparameter tuning, so we propose a method that needs no population assumption and whose hyperparameters are relatively easy to tune. The proposed method defines and computes a distance matrix between the class labels of the multi-class data and uses it to cluster the labels hierarchically, decomposing the multi-class problem into a tree of binary problems. Guided by the structure of this hierarchical tree, it then applies a sequence of pseudo-likelihood-based binary classifiers, none of which requires a population assumption, to each unclassified sample to obtain the final prediction. We apply the proposed method to several data sets and compare it with other common multi-class classification methods, finding higher accuracy and macro F1 scores under various metrics. In addition, we propose a multi-stage classification method that uses variable subsets selected by mutual entropy, and find that it improves classification accuracy on continuous data.
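The decomposition the abstract describes can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the thesis's implementation: it stands in Euclidean distances between class centroids for the thesis's label distance, and logistic regression for its pseudo-likelihood-based binary classifier, using the Iris data set (one of the data sets listed in the table of contents) as a toy example.

```python
# Sketch of a label embedding tree: cluster class labels hierarchically,
# then train one binary classifier per internal node and route samples
# down the tree. Centroid distance and logistic regression are stand-ins
# for the thesis's label distance and pseudo-likelihood classifier.
import numpy as np
from scipy.cluster.hierarchy import linkage, to_tree
from sklearn.linear_model import LogisticRegression
from sklearn.datasets import load_iris

def build_label_tree(X, y):
    # distance matrix between labels is induced here by class centroids
    labels = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in labels])
    Z = linkage(centroids, method="average")  # hierarchical label clustering
    return labels, to_tree(Z)

def train_node(node, X, y, labels, models):
    # each internal node separates the labels of its left subtree
    # from the labels of its right subtree with a binary classifier
    if node.is_leaf():
        return
    left = [labels[i] for i in node.get_left().pre_order()]
    right = [labels[i] for i in node.get_right().pre_order()]
    mask = np.isin(y, left + right)
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X[mask], np.isin(y[mask], left).astype(int))
    models[node.id] = (clf, node.get_left(), node.get_right())
    train_node(node.get_left(), X, y, labels, models)
    train_node(node.get_right(), X, y, labels, models)

def predict_one(x, root, labels, models):
    # route the sample through successive binary decisions to a leaf label
    node = root
    while not node.is_leaf():
        clf, left, right = models[node.id]
        node = left if clf.predict(x.reshape(1, -1))[0] == 1 else right
    return labels[node.pre_order()[0]]

X, y = load_iris(return_X_y=True)
labels, root = build_label_tree(X, y)
models = {}
train_node(root, X, y, labels, models)
pred = np.array([predict_one(x, root, labels, models) for x in X])
print((pred == y).mean())  # training accuracy of the toy tree
```

With K classes the tree has K - 1 internal nodes, so each prediction needs at most K - 1 binary decisions; the tree structure is what lets each binary problem stay small and assumption-free.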
References:

Allwein, E. L., Schapire, R. E., & Singer, Y. (2000). Reducing multi-class to binary: A unifying approach for margin classifiers. Journal of machine learning research, 1(Dec), 113-141.

Anthony, G., Gregg, H., & Tshilidzi, M. (2007). Image classification using SVMs: one-against-one vs one-against-all. arXiv preprint arXiv:0711.2914.

Baloochian, H., & Ghaffary, H. R. (2019). Multi-class Classification Based on Multi-criteria Decision-making. Journal of Classification, 36(1), 140-151.

Bouazizi, M., & Ohtsuki, T. (2016, May). Sentiment analysis: From binary to multi-class classification: A pattern-based approach for multi-class sentiment analysis in Twitter. In 2016 IEEE International Conference on Communications (ICC) (pp. 1-6). IEEE.

Breiman, L., Friedman, J., Stone, C. J., & Olshen, R. A. (1984). Classification and regression trees. CRC press.

Breiman, L. (2001). Random forests. Machine learning, 45(1), 5-32.

Casasent, D., & Wang, Y. C. (2005, July). Automatic target recognition using new support vector machine. In Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005. (Vol. 1, pp. 84-89). IEEE.

Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine learning, 20(3), 273-297.

Charytanowicz, M., Niewczas, J., Kulczycki, P., Kowalski, P. A., Łukasik, S., & Żak, S. (2010). Complete gradient clustering algorithm for features analysis of x-ray images. In Information technologies in biomedicine (pp. 15-24). Springer, Berlin, Heidelberg.

Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE transactions on information theory, 13(1), 21-27.

Cox, D. R. (1958). The regression analysis of binary sequences. Journal of the Royal Statistical Society: Series B (Methodological), 20(2), 215-232.

Crammer, K., & Singer, Y. (2002). On the learnability and design of output codes for multi-class problems. Machine learning, 47(2-3), 201-233.

Dietterich, T. G., & Bakiri, G. (1995). Solving multi-class learning problems via error-correcting output codes. CoRR. arXiv preprint cs.AI/9501101, 66.

Duda, R. O., Hart, P. E., & Stork, D. G. (2012). Pattern classification. John Wiley & Sons.

Hastie, T., Rosset, S., Zhu, J., & Zou, H. (2009). Multi-class AdaBoost. Statistics and its Interface, 2(3), 349-360.

Farooq, A., Anwar, S., Awais, M., & Rehman, S. (2017, October). A deep CNN based multi-class classification of Alzheimer's disease using MRI. In 2017 IEEE International Conference on Imaging Systems and Techniques (IST) (pp. 1-6). IEEE.

Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of eugenics, 7(2), 179-188.

Fushing, H., Liu, S. Y., Hsieh, Y. C., & McCowan, B. (2018). From patterned response dependency to structured covariate dependency: Entropy based categorical-pattern-matching. PloS one, 13(6), e0198253.

Hafner, M., Kwitt, R., Wrba, F., Gangl, A., Vécsei, A., & Uhl, A. (2008). One-against-one classification for zoom-endoscopy images.

Hastie, T., & Tibshirani, R. (1997). Classification by pairwise coupling. Advances in neural information processing systems, 10, 507-513.

Hsieh, F., & Chou, E. P. (2020). Categorical Exploratory Data Analysis: From Multi-class Classification and Response Manifold Analytics perspectives of baseball pitching dynamics. arXiv preprint arXiv:2006.14411.

Kang, S., Cho, S., & Kang, P. (2015). Constructing a multi-class classifier using one-against-one approach with different binary classifiers. Neurocomputing, 149, 677-682.

Kim, K. I., Jung, K., Park, S. H., & Kim, H. J. (2002). Support vector machines for texture classification. IEEE transactions on pattern analysis and machine intelligence, 24(11), 1542-1550.

La Cava, W., Silva, S., Danai, K., Spector, L., Vanneschi, L., & Moore, J. H. (2019). Multidimensional genetic programming for multi-class classification. Swarm and evolutionary computation, 44, 260-272.

Lei, H., & Govindaraju, V. (2005, June). Half-against-half multi-class support vector machines. In International Workshop on Multiple Classifier Systems (pp. 156-164). Springer, Berlin, Heidelberg.

Pal, M., & Mather, P. M. (2005). Support vector machines for classification in remote sensing. International journal of remote sensing, 26(5), 1007-1011.

Quinlan, J. R. (1986). Induction of decision trees. Machine Learning, 1(1), 81-106.

Quinlan, J. R. (1993). C4.5: Programs for machine learning. Elsevier.

Rajab, A., Huang, C. T., Al-Shargabi, M., & Cobb, J. (2016, November). Countering burst header packet flooding attack in optical burst switching network. In International Conference on Information Security Practice and Experience (pp. 315-329). Springer, Cham.

Schwenker, F., & Palm, G. (2001, July). Tree-structured support vector machines for multi-class pattern recognition. In International Workshop on Multiple Classifier Systems (pp. 409-417). Springer, Berlin, Heidelberg.
Description: Master's thesis
National Chengchi University
Department of Statistics
108354005
Source: http://thesis.lib.nccu.edu.tw/record/#G0108354005
Type: thesis
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/133844
Table of Contents:
Chapter 1: Introduction
Chapter 2: Literature Review
  Section 1: Research on multi-class classification
  Section 2: Classifier-based methods
  Section 3: Multi-class-to-binary decomposition methods
  Section 4: Summary
Chapter 3: Methodology
  Section 1: Label embedding tree
  Section 2: Pseudo-likelihood-based binary classifier
  Section 3: Model construction and classification procedure
  Section 4: Variable selection method
  Section 5: First-stage classification procedure
  Section 6: Classification improvement method
Chapter 4: Data Description
  1. Glass Dataset
  2. Burst Header Packet (BHP) flooding attack on Optical Burst Switching (OBS) Network Data Set
  3. Seeds Dataset
  4. Wine Dataset
  5. Zoo Dataset
  6. Iris Dataset
  7. Vertebral Column Data Set
  8. Energy Efficiency Data Set
  9. Image Segmentation Data Set
Chapter 5: Results
  Section 1: First-stage results
    1. Glass Dataset
    2. Zoo Dataset
    3. Energy Dataset
    4. Segment Dataset
  Section 2: Classification improvement results
  Section 3: Conclusion
Chapter 6: Future Directions
Chapter 7: References
Format: application/pdf (3,330,849 bytes)
DOI: 10.6814/NCCU202100115