Academic Output - Theses

Title 機器學習分類方法DCG與其他方法比較(以紅酒為例)
A supervised learning study of comparison between DCG tree and other machine learning methods in a wine quality dataset
Author 楊俊隆 (Yang, Jiun Lung)
Contributors 周珮婷 (Chou, Pei Ting, advisor); 楊俊隆 (Yang, Jiun Lung)
Keywords 監督式學習 (Supervised learning); 非監督式學習 (Unsupervised learning); 加權資料雲幾何樹 (WDCG, weighted data cloud geometry tree)
Date 2017
Uploaded 24-Jul-2017 11:58:59 (UTC+8)
Abstract With the arrival of the big data era, machine learning has become a popular topic. Its methods fall mainly into supervised and unsupervised learning, that is, classification and clustering. This study weights the distance matrix with the fitted results of a logistic regression and, taking the data cloud geometry (DCG) tree clustering method as its core, examines whether clustering first and then classifying yields better predictions on a red wine dataset containing a categorical variable. It also compares the predictive performance of various supervised machine learning methods with that of unsupervised learning followed by a classifier. The thesis first introduces common classification and clustering algorithms and analyzes their advantages, disadvantages, and underlying assumptions; it then presents the DCG-tree algorithm and details its steps. Finally, it introduces the weighted DCG-tree (WDCG) algorithm, which applies the idea of weighting to the DCG-tree algorithm, and uses the wine data to compare the prediction accuracy of the various classification and clustering methods.
Machine learning has become a popular topic since the advent of the big data era. Machine learning algorithms are often categorized as supervised or unsupervised, namely classification or clustering methods. In this study, we first introduce the advantages, disadvantages, and limitations of traditional classification and clustering algorithms. Next, we introduce the DCG-tree and WDCG algorithms and extend the idea of WDCG to cases with three class labels. The distance matrix is modified by the fitted results of a logistic regression. Lastly, using a real wine dataset, we compare the performance of WDCG with that of traditional classification methodologies. The study shows that using an unsupervised learning algorithm with logistic regression as a classifier performs better than using the traditional classification methods alone.
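The core step described in the abstract, reweighting a pairwise distance matrix by the fitted probabilities of a logistic regression before building the clustering tree, can be sketched as follows. The thesis's exact weighting formula is not reproduced in this record, so the multiplicative form below (`1 + |p_i - p_j|`) and the stand-in fitted probabilities are illustrative assumptions only:

```python
import numpy as np

def weighted_distance_matrix(X, p):
    """Weight pairwise Euclidean distances by differences in fitted probabilities.

    X : (n, d) feature matrix
    p : (n,) fitted class probabilities from a logistic regression

    Pairs whose fitted probabilities disagree get their distance inflated,
    nudging the subsequent clustering to separate them. The weighting form
    is an assumption, not the thesis's exact formula.
    """
    diff = X[:, None, :] - X[None, :, :]
    D = np.sqrt((diff ** 2).sum(axis=-1))        # plain Euclidean distances
    W = 1.0 + np.abs(p[:, None] - p[None, :])    # assumed multiplicative weight
    return D * W

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
# Stand-in for logistic-regression fitted probabilities on these points.
p = 1.0 / (1.0 + np.exp(-X @ np.array([0.5, -0.3, 0.2, 0.1])))
Dw = weighted_distance_matrix(X, p)
print(Dw.shape)  # (6, 6)
```

The weighted matrix stays symmetric with a zero diagonal, so it can be fed to any distance-based clustering routine (here, the DCG-tree) in place of the raw Euclidean distances.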
References Allwein, E. L., Schapire, R. E., & Singer, Y. (2000). Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research, 1, 113-141.
Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory (pp. 144-152). ACM.
Chakraborty, S. (2005). Bayesian machine learning. University of Florida.
Chou, E. P., Hsieh, F., & Capitanio, J. (2013). Computed data-geometry based supervised and semi-supervised learning in high dimensional data. In 2013 12th International Conference on Machine Learning and Applications (ICMLA) (Vol. 1, pp. 277-282). IEEE.
Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273-297.
Cortez, P., Cerdeira, A., Almeida, F., Matos, T., & Reis, J. (2009). Modeling wine preferences by data mining from physicochemical properties. Decision Support Systems, 47(4), 547-553.
Dietterich, T. G. (1997). Machine-learning research. AI Magazine, 18(4), 97.
Filzmoser, P., Baumgartner, R., & Moser, E. (1999). A hierarchical clustering method for analyzing functional MR images. Magnetic Resonance Imaging, 17(6), 817-826.
Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of Human Genetics, 7(2), 179-188.
Fushing, H., & McAssey, M. P. (2010). Time, temperature, and data cloud geometry. Physical Review E, 82(6), 061110.
Fushing, H., Wang, H., VanderWaal, K., McCowan, B., & Koehl, P. (2013). Multi-scale clustering by building a robust and self-correcting ultrametric topology on data points. PLoS ONE, 8(2), e56259.
Hartigan, J. A., & Wong, M. A. (1979). Algorithm AS 136: A K-means clustering algorithm. Journal of the Royal Statistical Society, Series C (Applied Statistics), 28(1), 100-108.
Hastie, T., & Tibshirani, R. (1998). Classification by pairwise coupling. In Advances in Neural Information Processing Systems (pp. 507-513).
Johnson, S. C. (1967). Hierarchical clustering schemes. Psychometrika, 32(3), 241-254.
Kotsiantis, S. B., Zaharakis, I. D., & Pintelas, P. E. (2006). Machine learning: A review of classification and combining techniques. Artificial Intelligence Review, 26(3), 159-190.
Peng, C. Y. J., Lee, K. L., & Ingersoll, G. M. (2002). An introduction to logistic regression analysis and reporting. The Journal of Educational Research, 96(1), 3-14.
Pereira, F., Mitchell, T., & Botvinick, M. (2009). Machine learning classifiers and fMRI: A tutorial overview. NeuroImage, 45(1), S199-S209.
Sharan, R. V., & Moir, T. J. (2014). Comparison of multiclass SVM classification techniques in an audio surveillance application under mismatched conditions. In 2014 19th International Conference on Digital Signal Processing (DSP) (pp. 83-88). IEEE.
Description Master's thesis
國立政治大學 (National Chengchi University)
統計學系 (Department of Statistics)
102354015
Source http://thesis.lib.nccu.edu.tw/record/#G0102354015
Type thesis
URI http://nccur.lib.nccu.edu.tw/handle/140.119/111304
Table of contents
Chapter 1  Introduction
  1.1  Research motivation
  1.2  Research objectives
Chapter 2  Literature review
  2.1  Supervised learning
    Support vector machines (SVM)
    Linear discriminant analysis (LDA)
    Quadratic discriminant analysis (QDA)
    Logistic regression
  2.2  Unsupervised learning
    Hierarchical clustering (HC)
    K-means clustering
    Data cloud geometry tree (DCG-tree)
    Weighted DCG (WDCG)
Chapter 3  Research methods
  3.1  Research procedure
  3.2  Research methods
Chapter 4  Results
Chapter 5  Conclusion
References
Format application/pdf, 878551 bytes