Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/137678
Title: 應用深度學習模型、時間序列分群方法和序列分析探討學生的學習表現
Applying deep learning models, time series clustering methods and sequential analysis to exploring students' learning performance
Authors: 林英儒 (Lin, Ying-Ru)
Contributors: 江玥慧 (Chiang, Yueh-hui)
林英儒 (Lin, Ying-Ru)
Keywords: Educational data mining
Deep learning
Long short-term memory
K-means
Dynamic time warping
Sequential analysis
Date: 2021
Date uploaded: 1-Nov-2021
Abstract: In a face-to-face classroom, instructors can readily observe how students are learning during class; when students run into problems, instructors can see where the difficulties lie and help resolve them. Outside of class time, however, it is hard for instructors to follow students' learning conditions and learning process. This study therefore collected students' log data from a learning management system and applied deep learning models, time series clustering methods, and sequential analysis to explore students' learning performance in the courses. The results are fed back to the teaching staff so that teachers and teaching assistants can identify and assist students who progress slowly or encounter problems during learning.
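A minimal sketch of how the time series clustering step described above can be set up, assuming weekly LMS activity counts as the input sequences. It uses the tslearn library listed in the references ([36]) for K-means with a dynamic time warping (DTW) distance; the synthetic data, the 18-week horizon, and the choice of three clusters are illustrative assumptions, not the actual pipeline of the thesis.

    # Illustrative sketch only: cluster synthetic weekly LMS activity sequences
    # with DTW-based K-means (tslearn); not the thesis's actual data or settings.
    import numpy as np
    from tslearn.clustering import TimeSeriesKMeans, silhouette_score
    from tslearn.preprocessing import TimeSeriesScalerMeanVariance

    rng = np.random.default_rng(0)

    # Hypothetical log-derived features: 100 students x 18 weeks x 1 feature
    # (e.g., number of LMS actions per week).
    X = rng.poisson(lam=5.0, size=(100, 18, 1)).astype(float)

    # Normalise each student's sequence so clusters reflect the temporal shape
    # of activity rather than its overall volume.
    X = TimeSeriesScalerMeanVariance().fit_transform(X)

    # K-means over the sequences with a DTW metric (DBA barycenters).
    km = TimeSeriesKMeans(n_clusters=3, metric="dtw", random_state=0)
    labels = km.fit_predict(X)

    # Silhouette coefficient (Rousseeuw, 1987) under the same DTW metric,
    # one common way to compare candidate numbers of clusters.
    print("silhouette:", silhouette_score(X, labels, metric="dtw"))
    print("cluster sizes:", np.bincount(labels))

The silhouette coefficient cited in the reference list (Rousseeuw, 1987) is one common criterion for choosing the number of clusters; the deep learning side of the study (a long short-term memory model, per the keywords and the Keras reference) would consume similar per-week sequences but is not sketched here.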
References:
[1] 王承璋 (2014)。輔助老師之學生核心能力雷達圖群體學生模型之研發。Master's thesis, Yuan Ze University, Taoyuan, Taiwan.
[2] 陳馨順 (2015)。運用關聯規則與分群技術探討放射科實習學生網路學習行為。Master's thesis, National Taipei University of Nursing and Health Sciences, Taipei, Taiwan.
[3] Allison, P. D., & Liker, J. K. (1982). Analyzing sequential categorical data on dyadic interaction: A comment on Gottman.
[4] Bakeman, R., & Gottman, J. M. (1997). Observing interaction: An introduction to sequential analysis. Cambridge University Press.
[5] Baker, R. S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of educational data mining, 1(1), 3-17.
[6] Bakhshinategh, B., Zaiane, O. R., ElAtia, S., & Ipperciel, D. (2018). Educational data mining applications and tasks: A survey of the last 10 years. Education and Information Technologies, 23(1), 537-553.
[7] Chang, S. C., Hsu, T. C., Kuo, W. C., & Jong, M. S. Y. (2020). Effects of applying a VR‐based two‐tier test strategy to promote elementary students' learning performance in a Geology class. British Journal of Educational Technology, 51(1), 148-165.
[8] Chen, F., & Cui, Y. (2020). Utilizing student time series behaviour in learning management systems for early prediction of course performance. Journal of Learning Analytics, 7(2), 1-17.
[9] Cho, K., Van Merriënboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., & Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv preprint arXiv:1406.1078.
[10] Chrysafiadi, K., & Virvou, M. (2013). Student modeling approaches: A literature review for the last decade. Expert Systems with Applications, 40(11), 4715-4729.
[11] Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE transactions on information theory, 13(1), 21-27.
[12] Dozat, T. (2016). Incorporating Nesterov momentum into Adam.
[13] Elman, J. L. (1990). Finding structure in time. Cognitive science, 14(2), 179-211.
[14] Fotso, J. E. M., Batchakui, B., Nkambou, R., & Okereke, G. Algorithms for the development of deep learning models for classification and prediction of behaviour in MOOCS. In 2020 IEEE Learning With MOOCS (LWMOOCS) (pp. 180-184). IEEE.
[15] Géron, A. (2019). Hands-on machine learning with Scikit-Learn, Keras, and TensorFlow: Concepts, tools, and techniques to build intelligent systems. O'Reilly Media.
[16] GSEQ, from: https://www.mangold-international.com/en/products/software/gseq
[17] Hatcher, W. G., & Yu, W. (2018). A survey of deep learning: Platforms, applications and emerging research trends. IEEE Access, 6, 24411-24432.
[18] Hernández-Blanco, A., Herrera-Flores, B., Tomás, D., & Navarro-Colorado, B. (2019). A systematic review of deep learning approaches to educational data mining. Complexity, 2019.
[19] Hochreiter, S. (1998). The vanishing gradient problem during learning recurrent neural nets and problem solutions. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 6(02), 107-116.
[20] Hochreiter, S., & Schmidhuber, J. (1997). Long short-term memory. Neural computation, 9(8), 1735-1780.
[21] Hou, H. T. (2012). Exploring the behavioral patterns of learners in an educational massively multiple online role-playing game (MMORPG). Computers & Education, 58(4), 1225-1233.
[22] Keras, from: https://github.com/keras-team/keras
[23] MacQueen, J. (1967, June). Some methods for classification and analysis of multivariate observations. In Proceedings of the fifth Berkeley symposium on mathematical statistics and probability (Vol. 1, No. 14, pp. 281-297).
[24] Oeda, S., & Hashimoto, G. (2017). Log-data clustering analysis for dropout prediction in beginner programming classes. Procedia computer science, 112, 614-621.
[25] Okubo, F., Yamashita, T., Shimada, A., & Ogata, H. (2017, March). A neural network approach for students' performance prediction. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 598-599).
[26] Petitjean, F., Ketterlin, A., & Gançarski, P. (2011). A global averaging method for dynamic time warping, with applications to clustering. Pattern recognition, 44(3), 678-693.
[27] Romero, C., & Ventura, S. (2013). Data mining in education. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 3(1), 12-27.
[28] Romero, C., & Ventura, S. (2020). Educational data mining and learning analytics: An updated survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3), e1355.
[29] Romero, C., Ventura, S., & García, E. (2008). Data mining in course management systems: Moodle case study and tutorial. Computers & Education, 51(1), 368-384.
[30] Rousseeuw, P. J. (1987). Silhouettes: a graphical aid to the interpretation and validation of cluster analysis. Journal of computational and applied mathematics, 20, 53-65.
[31] Rubinstein, R. (1999). The cross-entropy method for combinatorial and continuous optimization. Methodology and computing in applied probability, 1(2), 127-190.
[32] Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1985). Learning internal representations by error propagation. California Univ San Diego La Jolla Inst for Cognitive Science.
[33] Sakoe, H., & Chiba, S. (1978). Dynamic programming algorithm optimization for spoken word recognition. IEEE transactions on acoustics, speech, and signal processing, 26(1), 43-49.
[34] Sokolova, M., & Lapalme, G. (2009). A systematic analysis of performance measures for classification tasks. Information processing & management, 45(4), 427-437.
[35] Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., & Salakhutdinov, R. (2014). Dropout: a simple way to prevent neural networks from overfitting. The journal of machine learning research, 15(1), 1929-1958.
[36] Tslearn, from: https://github.com/tslearn-team/tslearn/
Description: Master's thesis
National Chengchi University
Department of Computer Science
108753211
Source: http://thesis.lib.nccu.edu.tw/record/#G0108753211
Type: thesis
Appears in Collections: Theses and Dissertations (學位論文)

Files in This Item:
321101.pdf (5.1 MB, Adobe PDF)