NCCU Library — Publications (Theses)
Title: 機器學習分類方法DCG 與其他方法比較(以紅酒為例)
(A supervised learning study comparing the DCG tree with other machine learning methods on a wine quality dataset)
Author: Yang, Jiun Lung (楊俊隆)
Advisor: Chou, Pei Ting (周珮婷)
Keywords: supervised learning; unsupervised learning; weighted data cloud geometry tree (WDCG)
Date: 2017
Uploaded: 24-Jul-2017 11:58:59 (UTC+8)

Abstract (translated from the Chinese): With the arrival of the big data era, machine learning has become a popular topic. Its methods fall mainly into supervised and unsupervised learning, that is, classification and clustering. This study weights a distance matrix with the fitted results of a logistic regression and applies the data cloud geometry (DCG) tree clustering method to a wine dataset containing a categorical variable, asking whether clustering first and then classifying yields better predictions. It also compares the predictive performance of various supervised machine learning methods with that of unsupervised learning followed by a classifier. The thesis first introduces common classification and clustering algorithms and analyzes their advantages, disadvantages, and assumptions; it then presents the DCG-tree algorithm and details its steps. Finally, it introduces the weighted DCG-tree (WDCG) algorithm, which applies the idea of weighting to the DCG-tree, and uses the wine data to compare the prediction accuracy of the various classification and clustering methods.
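The weighting idea the abstract describes — modifying a distance matrix with the fitted results of a logistic regression before clustering — can be sketched minimally as follows. This is a hypothetical illustration, not the thesis's exact WDCG procedure: the toy 1-D two-class data, the gradient-descent fit, and the particular weighting formula `d * (1 + |p1 - p2|)` are all illustrative assumptions.

```python
# Hypothetical sketch: re-weight pairwise distances by fitted logistic-
# regression probabilities, so points the model scores differently are
# pushed further apart before clustering. Not the thesis's exact method.
import math
import random

random.seed(0)

# Toy 1-D, two-class data standing in for the wine features.
data = [(random.gauss(0.0, 1.0), 0) for _ in range(30)] + \
       [(random.gauss(3.0, 1.0), 1) for _ in range(30)]

# Fit p(y=1|x) = sigmoid(w*x + b) by plain gradient descent.
w, b = 0.0, 0.0
for _ in range(2000):
    gw = gb = 0.0
    for x, y in data:
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (p - y) * x
        gb += (p - y)
    w -= 0.05 * gw / len(data)
    b -= 0.05 * gb / len(data)

def prob(x):
    """Fitted probability of class 1 for a point x."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def weighted_dist(x1, x2):
    """Feature distance, inflated when fitted probabilities disagree
    (an assumed weighting scheme, for illustration only)."""
    return abs(x1 - x2) * (1.0 + abs(prob(x1) - prob(x2)))
```

The resulting `weighted_dist` matrix would then feed the clustering step in place of the raw Euclidean distances.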
Abstract (English): Machine learning has become a popular topic since the advent of the big data era. Machine learning algorithms are often categorized as supervised or unsupervised, that is, classification or clustering methods. In this study, we first introduce the advantages, disadvantages, and limitations of traditional classification and clustering algorithms. Next, we introduce the DCG-tree and WDCG algorithms and extend the idea of WDCG to cases with three class labels, modifying the distance matrix with the fitted results of a logistic regression. Lastly, using a real wine dataset, we compare the performance of WDCG with that of traditional classification methodologies. The study shows that an unsupervised learning algorithm with logistic regression as a classifier performs better than the traditional classification methods alone.

References:
Allwein, E. L., Schapire, R. E., & Singer, Y. (2000). Reducing multiclass to binary: A unifying approach for margin classifiers. Journal of Machine Learning Research, 1(Dec), 113-141.
Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory (pp. 144-152). ACM.
Chakraborty, S. (2005). Bayesian machine learning. University of Florida.
Chou, E. P., Hsieh, F., & Capitanio, J. (2013). Computed data-geometry based supervised and semi-supervised learning in high dimensional data. In 2013 12th International Conference on Machine Learning and Applications (ICMLA) (Vol. 1, pp. 277-282). IEEE.
Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine Learning, 20(3), 273-297.
Cortez, P., Cerdeira, A., Almeida, F., Matos, T., & Reis, J. (2009). Modeling wine preferences by data mining from physicochemical properties. Decision Support Systems, 47(4), 547-553.
Dietterich, T. G. (1997). Machine-learning research. AI Magazine, 18(4), 97.
Filzmoser, P., Baumgartner, R., & Moser, E. (1999). A hierarchical clustering method for analyzing functional MR images. Magnetic Resonance Imaging, 17(6), 817-826.
Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of Human Genetics, 7(2), 179-188.
Fushing, H., & McAssey, M. P. (2010). Time, temperature, and data cloud geometry. Physical Review E, 82(6), 061110.
Fushing, H., Wang, H., VanderWaal, K., McCowan, B., & Koehl, P. (2013). Multi-scale clustering by building a robust and self-correcting ultrametric topology on data points. PLoS ONE, 8(2), e56259.
Hartigan, J. A., & Wong, M. A. (1979). Algorithm AS 136: A k-means clustering algorithm. Journal of the Royal Statistical Society, Series C (Applied Statistics), 28(1), 100-108.
Hastie, T., & Tibshirani, R. (1998). Classification by pairwise coupling. In Advances in Neural Information Processing Systems (pp. 507-513).
Johnson, S. C. (1967). Hierarchical clustering schemes. Psychometrika, 32(3), 241-254.
Kotsiantis, S. B., Zaharakis, I. D., & Pintelas, P. E. (2006). Machine learning: A review of classification and combining techniques. Artificial Intelligence Review, 26(3), 159-190.
Peng, C. Y. J., Lee, K. L., & Ingersoll, G. M. (2002). An introduction to logistic regression analysis and reporting. The Journal of Educational Research, 96(1), 3-14.
Pereira, F., Mitchell, T., & Botvinick, M. (2009). Machine learning classifiers and fMRI: A tutorial overview. NeuroImage, 45(1), S199-S209.
Sharan, R. V., & Moir, T. J. (2014). Comparison of multiclass SVM classification techniques in an audio surveillance application under mismatched conditions. In 2014 19th International Conference on Digital Signal Processing (DSP) (pp. 83-88). IEEE.

Degree: Master's
Institution: National Chengchi University
Department: Department of Statistics
Student ID: 102354015
Source: http://thesis.lib.nccu.edu.tw/record/#G0102354015
Other identifier: G0102354015
URI: http://nccur.lib.nccu.edu.tw/handle/140.119/111304
Type: thesis
Table of contents (translated):
Chapter 1: Introduction (1.1 Research motivation; 1.2 Research objectives)
Chapter 2: Literature review
  2.1 Supervised learning: support vector machines (SVM); linear discriminant analysis (LDA); quadratic discriminant analysis (QDA); logistic regression
  2.2 Unsupervised learning: hierarchical clustering (HC); k-means clustering; data cloud geometry tree (DCG-tree); WDCG
Chapter 3: Research methodology (3.1 Research process; 3.2 Research methods)
Chapter 4: Results
Chapter 5: Conclusion
References

Format: application/pdf, 878,551 bytes
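The "cluster first, then classify" workflow the abstract compares against direct classification can be sketched minimally as follows. This is an illustrative assumption throughout: toy 1-D two-class data stands in for the wine features, a simple k-means loop stands in for the DCG-tree clustering step, and a per-cluster majority-vote rule stands in for the logistic-regression classifier.

```python
# Hypothetical sketch of cluster-then-classify: partition the data first,
# then predict within each cluster. K-means (k=2) and majority vote are
# stand-ins for DCG-tree and logistic regression, respectively.
import random

random.seed(1)
# Toy 1-D data: two classes with well-separated means.
data = [(random.gauss(0.0, 0.8), 0) for _ in range(40)] + \
       [(random.gauss(4.0, 0.8), 1) for _ in range(40)]

# Step 1: 1-D k-means with k=2 (stand-in for the DCG-tree clustering step).
c0, c1 = 0.5, 3.5  # initial centers
for _ in range(20):
    g0 = [x for x, _ in data if abs(x - c0) <= abs(x - c1)]
    g1 = [x for x, _ in data if abs(x - c0) > abs(x - c1)]
    c0 = sum(g0) / len(g0)
    c1 = sum(g1) / len(g1)

def cluster_of(x):
    """Assign a point to its nearest cluster center."""
    return 0 if abs(x - c0) <= abs(x - c1) else 1

# Step 2: within each cluster, predict that cluster's majority label
# (stand-in for fitting a classifier per cluster).
majority = {}
for c in (0, 1):
    labels = [y for x, y in data if cluster_of(x) == c]
    majority[c] = max(set(labels), key=labels.count)

correct = sum(1 for x, y in data if majority[cluster_of(x)] == y)
accuracy = correct / len(data)
```

In the thesis's setting the comparison would be between this two-stage accuracy and that of classifiers applied directly to the full dataset.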