Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/137678
DC Field | Value | Language
dc.contributor.advisor | 江玥慧 | zh_TW
dc.contributor.advisor | Chiang, Yueh-hui | en_US
dc.contributor.author | 林英儒 | zh_TW
dc.contributor.author | Lin, Ying-Ru | en_US
dc.creator | 林英儒 | zh_TW
dc.creator | Lin, Ying-Ru | en_US
dc.date | 2021 | en_US
dc.date.accessioned | 2021-11-01T04:00:56Z | -
dc.date.available | 2021-11-01T04:00:56Z | -
dc.date.issued | 2021-11-01T04:00:56Z | -
dc.identifier | G0108753211 | en_US
dc.identifier.uri | http://nccur.lib.nccu.edu.tw/handle/140.119/137678 | -
dc.description | Master's degree | zh_TW
dc.description | National Chengchi University | zh_TW
dc.description | Department of Computer Science | zh_TW
dc.description | 108753211 | zh_TW
dc.description.abstract (zh_TW):
在面對面的實體教室中,教學現場的人員比較容易觀察學生於課堂中的學習狀況;當學生在學習過程中遇到問題時,也較能清楚地了解問題所在,幫助學生解決問題。不過在課堂以外的時間,教學人員不易得知學生的學習狀況與學習過程。因此,本研究希望透過學習管理系統收集學生在學習過程中的日誌資料(Logs),並使用深度學習模型、時間序列分群方法和序列分析探討學生於課程中的學習表現,最後將研究結果回饋給教學現場的人員,使老師和助教能夠幫助學習進度較緩慢、或是在學習過程中遇到問題的學生。
dc.description.abstract (en_US):
In a face-to-face classroom, it is easier for teachers to observe students' learning process. When students encounter problems in class, teachers can see the situation and help students solve the problems. However, it is not easy for teachers to know students' learning conditions and learning process outside of class. Therefore, this study collected students' learning log data from a learning management system, and used deep learning models, time series clustering methods, and sequential analysis to explore students' learning performance in the courses. The results of this study can help teachers identify low-performing students so as to provide necessary assistance.
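The time-series clustering mentioned in the abstract rests on the dynamic time warping (DTW) distance, which aligns two activity sequences that may be shifted or stretched in time (the thesis itself applies DTW-based k-means via tslearn, per its references). The sketch below is a minimal, self-contained illustration of the DTW distance; the function name and the example sequences are illustrative, not taken from the thesis:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.

    cost[i][j] holds the minimum cumulative cost of aligning the
    first i elements of `a` with the first j elements of `b`.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance between points
            # extend the cheapest of: insertion, deletion, or match
            cost[i][j] = d + min(cost[i - 1][j],
                                 cost[i][j - 1],
                                 cost[i - 1][j - 1])
    return cost[n][m]

# Identical series align perfectly, so their DTW distance is zero;
# a series shifted by one unit accrues only a small alignment cost.
print(dtw_distance([1, 2, 3], [1, 2, 3]))  # 0.0
print(dtw_distance([1, 2, 3], [2, 3, 4]))  # 2.0
```

A k-means variant for time series replaces the Euclidean distance in the assignment step with this DTW distance (and uses DTW barycenter averaging for centroids), which is what makes clustering of weekly submission-activity curves robust to students who act on similar patterns but on shifted schedules.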
dc.description.tableofcontents (zh_TW):
Chapter 1 Introduction 1
1.1 Research Background 1
1.2 Research Motivation and Purpose 2
1.3 Research Questions 3
1.4 Thesis Organization 3
Chapter 2 Literature Review 4
2.1 Educational Data Mining 4
2.2 Deep Learning 11
2.3 Recurrent Neural Networks 15
2.4 Dynamic Time Warping Applied to K-means 19
2.5 Sequential Analysis 22
Chapter 3 Research Methods 25
3.1 Data Sources 25
3.2 Experimental Framework 25
3.3 Experiment 1: Predicting Students' Learning Outcomes 26
3.3.1 Experimental Data 26
3.3.2 Data Preprocessing 27
3.3.3 Building the Deep Neural Network Model 30
3.3.4 Evaluation Metrics 32
3.4 Experiment 2: Clustering Submission Features of Individual Assignments 32
3.4.1 Experimental Data 33
3.4.2 Data Preprocessing 33
3.4.3 Building the Clustering Model 34
3.4.4 Determining the Number of Clusters 34
3.5 Experiment 3: Behavioral Patterns in Group Assignments 35
3.5.1 Experimental Data 35
3.5.2 Course Activities 35
3.5.3 Behavior Coding 38
3.5.4 Sequential Analysis 41
Chapter 4 Results and Discussion 44
4.1 Prediction Performance and Low-Performing Students 44
4.2 Clustering Results and Behavioral Patterns in Group Assignments 46
4.3 Case Discussions 54
4.4 Summary 61
Chapter 5 Conclusions and Future Work 65
References 66
Appendix A 70
dc.format.extent | 5224053 bytes | -
dc.format.mimetype | application/pdf | -
dc.source.uri | http://thesis.lib.nccu.edu.tw/record/#G0108753211 | en_US
dc.subject | 教育資料探勘 | zh_TW
dc.subject | 深度學習 | zh_TW
dc.subject | 長短期記憶模型 | zh_TW
dc.subject | K-means | zh_TW
dc.subject | 動態時間校正 | zh_TW
dc.subject | 序列分析 | zh_TW
dc.subject | Education data mining | en_US
dc.subject | Deep learning | en_US
dc.subject | Long short-term memory | en_US
dc.subject | K-means | en_US
dc.subject | Dynamic time warping | en_US
dc.subject | Sequential analysis | en_US
dc.title | 應用深度學習模型、時間序列分群方法和序列分析探討學生的學習表現 | zh_TW
dc.title | Applying deep learning models, time series clustering methods and sequential analysis to exploring students' learning performance | en_US
dc.type | thesis | en_US
dc.relation.reference (zh_TW):
[1] 王承璋 (2014). Development of a group student model with core-competency radar charts to assist teachers. Master's thesis, Yuan Ze University, Taoyuan, Taiwan.
[2] 陳馨順 (2015). Using association rules and clustering techniques to explore the online learning behaviors of radiology interns. Master's thesis, National Taipei University of Nursing and Health Sciences, Taipei, Taiwan.
[3] Allison, P. D., & Liker, J. K. (1982). Analyzing sequential categorical data on dyadic interaction: A comment on Gottman.
[4] Bakeman, R., & Gottman, J. M. (1997). Observing interaction: An introduction to sequential analysis. Cambridge University Press.
[5] Baker, R. S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining, 1(1), 3-17.
[6] Bakhshinategh, B., Zaiane, O. R., ElAtia, S., & Ipperciel, D. (2018). Educational data mining applications and tasks: A survey of the last 10 years. Education and Information Technologies, 23(1), 537-553.
[7] Chang, S. C., Hsu, T. C., Kuo, W. C., & Jong, M. S. Y. (2020). Effects of applying a VR‐based two‐tier test strategy to promote elementary students' learning performance in a Geology class. British Journal of Educational Technology, 51(1), 148-165.
[8] Chen, F., & Cui, Y. (2020). Utilizing student time series behaviour in learning management systems for early prediction of course performance. Journal of Learning Analytics, 7(2), 1-17.
[9] Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
[10] Chrysafiadi, K., & Virvou, M. (2013). Student modeling approaches: A literature review for the last decade. Expert Systems with Applications, 40(11), 4715-4729.
[11] Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21-27.
[12] Dozat, T. (2016). Incorporating Nesterov momentum into Adam.
[13] Elman, J. L. (1990). Finding structure in time. Cognitive Science, 14(2), 179-211.
[14] Fotso, J. E. M., Batchakui, B., Nkambou, R., & Okereke, G. Algorithms for the development of deep learning models for classification and prediction of behaviour in MOOCS. In 2020 IEEE Learning With MOOCS (LWMOOCS) (pp. 180-184). IEEE.
[15] Géron, A. (2019). Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O'Reilly Media.
[16] GSEQ, from: https://www.mangold-international.com/en/products/software/gseq
[17] Hatcher, W. G., & Yu, W. (2018). A survey of deep learning: Platforms, applications and emerging research trends. IEEE Access, 6, 24411-24432.
[18] Hernández-Blanco, A., Herrera-Flores, B., Tomás, D., & Navarro-Colorado, B. (2019). A systematic review of deep learning approaches to educational data mining. Complexity, 2019.
[19] Hochreiter, S. (1998). The vanishing gradient problem during learning recurrent neural nets and problem solutions. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 6(02), 107-116.
[20] Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8), 1735-1780.
[21] Hou, H. T. (2012). Exploring the behavioral patterns of learners in an educational massively multiple online role-playing game (MMORPG). Computers & Education, 58(4), 1225-1233.
[22] Keras, from: https://github.com/keras-team/keras
[23] MacQueen, J. (1967, June). Some methods for classification and analysis of multivariate observations. In Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability (Vol. 1, No. 14, pp. 281-297).
[24] Oeda, S., & Hashimoto, G. (2017). Log-data clustering analysis for dropout prediction in beginner programming classes. Procedia Computer Science, 112, 614-621.
[25] Okubo, F., Yamashita, T., Shimada, A., & Ogata, H. (2017, March). A neural network approach for students' performance prediction. In Proceedings of the Seventh International Learning Analytics & Knowledge Conference (pp. 598-599).
[26] Petitjean, F., Ketterlin, A., & Gançarski, P. (2011). A global averaging method for dynamic time warping, with applications to clustering. Pattern Recognition, 44(3), 678-693.
[27] Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1), 12-27.
[28] Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3), e1355.
[29] Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.
[30] Rousseeuw, P. J. (1987). Silhouettes: A graphical aid to the interpretation and validation of cluster analysis. Journal of Computational and Applied Mathematics, 20, 53-65.
[31] Rubinstein, R. (1999). The cross-entropy method for combinatorial and continuous optimization. Methodology and Computing in Applied Probability, 1(2), 127-190.
[32] Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1985). Learning internal representations by error propagation. California Univ San Diego La Jolla Inst for Cognitive Science.
[33] Sakoe, H., & Chiba, S. (1978). Dynamic programming algorithm optimization for spoken word recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing, 26(1), 43-49.
[34] Sokolova, M., & Lapalme, G. (2009). A systematic analysis of performance measures for classification tasks. Information Processing & Management, 45(4), 427-437.
[35] Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. The Journal of Machine Learning Research, 15(1), 1929-1958.
[36] Tslearn, from: https://github.com/tslearn-team/tslearn/
dc.identifier.doi | 10.6814/NCCU202101631 | en_US
item.grantfulltext | restricted | -
item.openairecristype | http://purl.org/coar/resource_type/c_46ec | -
item.fulltext | With Fulltext | -
item.cerifentitytype | Publications | -
item.openairetype | thesis | -
Appears in Collections: 學位論文 (Theses and Dissertations)
Files in This Item:
File | Description | Size | Format
321101.pdf |  | 5.1 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.