Please use this identifier to cite or link to this item: https://ah.lib.nccu.edu.tw/handle/140.119/134854
Title: Developing an Instant Semantic Analysis and Feedback System to Facilitate Learning Performance of Online Discussion
Authors: 陳志銘
Chen, Chih-Ming 
Li, Ming-Chaun
Huang, Ya-Ling
Contributors: 圖檔所
Date: May-2023
Date uploaded: 22-Apr-2021
Abstract: By applying two-mode social network and Chinese word segmentation technologies, a novel visualization tool, the instant semantic analysis and feedback system (ISAFS), is designed in this study to present the semantic networks of co-words and non-co-words used in learners' discussion processes and to help learners grasp the discussion direction, thereby enhancing online learning effectiveness. In a quasi-experimental design, the ISAFS-assisted discussion board was assigned to the 32 learners of one class as the experimental group, whereas a general online discussion board was assigned to the 32 learners of another class as the control group, to support online discussion learning activities on socio-scientific issues (SSIs). Analytical results show no statistically significant differences between the two groups in the mean scores of overall, complexity, and perspective dimensions of socio-scientific reasoning performance. However, the in-depth interview results suggest that learners in the experimental group gave positive feedback on the design concepts of the ISAFS and could explore the complexity and various perspectives of an issue through the co-words and non-co-words presented on the ISAFS graph. Several suggestions regarding the functions and user interface of the ISAFS were also collected from the interviews, providing directions for future studies.
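To illustrate the co-word idea the abstract describes, the following is a minimal, hypothetical sketch (not the authors' implementation) of deriving a co-word network from a two-mode (post-by-word) structure. It assumes posts have already been word-segmented, the step the study performs with Chinese word segmentation; the toy English words are placeholders.

```python
# Hypothetical sketch: one-mode co-word projection of a two-mode
# (post x word) network, as used conceptually by co-word analysis.
from itertools import combinations
from collections import Counter

# Toy segmented posts: each post is a list of content words.
posts = [
    ["energy", "nuclear", "risk"],
    ["energy", "solar", "cost"],
    ["nuclear", "risk", "policy"],
]

# Two-mode incidence: post index -> set of words it contains.
incidence = {i: set(words) for i, words in enumerate(posts)}

# One-mode projection onto words: two words are linked ("co-words")
# when they appear in the same post; the edge weight counts how many
# posts they share. Word pairs with no shared post are "non-co-words".
co_word = Counter()
for words in incidence.values():
    for a, b in combinations(sorted(words), 2):
        co_word[(a, b)] += 1

print(co_word[("nuclear", "risk")])  # → 2 (co-occur in two posts)
```

A visualization layer such as the ISAFS graph would then render the weighted co-word edges, letting learners see which concepts the discussion has connected and which remain unlinked.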
Relation: Interactive Learning Environments, Vol.31, No.3, pp.1402-1420
Type: article
DOI: https://doi.org/10.1080/10494820.2020.1839505
Appears in Collections: Journal Articles

Files in This Item:
194.pdf (2.39 MB, Adobe PDF)


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.