Title: 對於閱讀的感興趣程度與眼動特徵關係之研究
The Research on the Relationship between Interesting Degree of Reading and Eye Movement Features
Author: 王加元 (Wang, Jia Yuan)
Advisors: 陳良弼 (Chen, Arbee L.P.); 蔡介立 (Tsai, Jie Li)
Keywords (zh): 眼動; 閱讀; 感興趣程度; 資料探勘; 序列資料探勘; 重複片段; 分類
Keywords (en): eye movement; reading; interesting degree; data mining; sequence mining; repeating pattern; classification
Date: 2008
Uploaded: 17-Sep-2009 14:04:29 (UTC+8)
Abstract: Much research has examined the relationship between eye movements and human cognition, including comprehension and degree of interest; among these topics, eye movements during text reading are the most frequently discussed and studied. The purpose of this research is to determine whether a relationship exists between readers' eye movements during reading and their degree of interest. Its distinguishing feature is that, instead of analyzing eye-movement measures on each area of interest (AOI) as is commonly done, it transforms the eye movements into sequence data and applies data-mining methods to find the segments of the eye-movement sequences that discriminate between degrees of interest. Through this analysis of eye-movement trajectories, the results may in the future serve as an effective form of implicit feedback in information retrieval, improving the effectiveness of existing search engines.
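The abstract only sketches the sequence-mining idea. As a purely illustrative sketch (not the thesis's actual algorithm — the function names, symbol codes, and thresholds below are all hypothetical), fixation positions could be encoded as saccade symbols and mined for n-grams whose relative frequencies differ between "interested" and "uninterested" readings:

```python
# Hedged sketch: encode a reading scanpath as a symbol sequence, then mine
# n-grams whose relative frequencies differ between two interest classes.
from collections import Counter


def encode(fixation_xs, short=50):
    """Map consecutive fixation x-positions to movement symbols:
    'f'/'F' = short/long forward saccade, 'r'/'R' = short/long regression."""
    syms = []
    for a, b in zip(fixation_xs, fixation_xs[1:]):
        d = b - a
        if d >= 0:
            syms.append('f' if d < short else 'F')
        else:
            syms.append('r' if -d < short else 'R')
    return ''.join(syms)


def ngrams(seq, n):
    """All contiguous length-n substrings of a symbol sequence."""
    return [seq[i:i + n] for i in range(len(seq) - n + 1)]


def discriminative_patterns(pos_seqs, neg_seqs, n=2, min_gap=0.2):
    """Return n-grams whose relative frequency in the positive class differs
    from the negative class by at least min_gap (positive gap = positive class)."""
    pos = Counter(g for s in pos_seqs for g in ngrams(s, n))
    neg = Counter(g for s in neg_seqs for g in ngrams(s, n))
    pt, nt = sum(pos.values()) or 1, sum(neg.values()) or 1
    out = {}
    for g in set(pos) | set(neg):
        gap = pos[g] / pt - neg[g] / nt
        if abs(gap) >= min_gap:
            out[g] = gap
    return out
```

On toy scanpaths, a regression-then-forward bigram such as `'rf'` would surface as a positive-class pattern while `'FF'` (long forward sweeps) would mark the negative class. The thesis itself mines approximate repeating patterns in the spirit of [13], which this frequency-gap sketch does not attempt.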
References:
[1] Dario D. Salvucci and Joseph H. Goldberg, “Identifying Fixations and Saccades in Eye-Tracking Protocols,” Proceedings of the 2000 Symposium on Eye Tracking Research & Applications.
[2] 蔡介立 (Jie-Li Tsai), 顏妙璇 (Miao-Hsuan Yen), and 汪勁安 (Chin-An Wang), “眼球移動測量及在中文閱讀研究之應用,” 應用心理研究, no. 28, pp. 91-104, 2005.
[3] Jarkko Salojärvi, Kai Puolamäki, Jaana Simola, Lauri Kovanen, Ilpo Kojo, and Samuel Kaski, “Inferring Relevance from Eye Movements: Feature Extraction,” Helsinki University of Technology, Publications in Computer and Information Science.
[4] K. Rayner, K. H. Chace, T. J. Slattery, and J. Ashby, “Eye Movements as Reflections of Comprehension Processes in Reading,” Scientific Studies of Reading 10(3): 241-255.
[5] Aulikki Hyrskykari, Päivi Majaranta, Antti Aaltonen, and Kari-Jouko Räihä, “Design Issues of iDict: A Gaze-Assisted Translation Aid,” Proceedings of the 2000 Symposium on Eye Tracking Research & Applications.
[6] Bing Pan, Helene A. Hembrooke, Geri K. Gay, Laura A. Granka, Matthew K. Feusner, and Jill K. Newman, “The Determinants of Web Page Viewing Behavior: An Eye-Tracking Study,” Proceedings of the 2004 Symposium on Eye Tracking Research & Applications.
[7] Julia M. West, Anne R. Haake, Evelyn P. Rozanski, and Keith S. Karn, “eyePatterns: Software for Identifying Patterns and Similarities Across Fixation Sequences,” Proceedings of the 2006 Symposium on Eye Tracking Research & Applications.
[8] Hidetake Uwano, Masahide Nakamura, Akito Monden, and Ken-ichi Matsumoto, “Analyzing Individual Performance of Source Code Review Using Reviewer's Eye Movement,” Proceedings of the 2006 Symposium on Eye Tracking Research & Applications.
[9] Georg Buscher, “Attention-Based Information Retrieval,” Proceedings of the ACM SIGIR Conference on Research and Development in Information Retrieval, 2007.
[10] David Hardoon, John Shawe-Taylor, Antti Ajanki, Kai Puolamäki, and Samuel Kaski, “Information Retrieval by Inferring Implicit Queries from Eye Movements,” Artificial Intelligence and Statistics, 2007.
[11] 中央研究院漢語平衡語料庫 (Academia Sinica Balanced Corpus of Modern Chinese), http://www.aclclp.org.tw/use_asbc_c.php
[12] Shyamala Doraisamy and Stefan Rüger, “Robust Polyphonic Music Retrieval with N-grams,” Journal of Intelligent Information Systems, 21(1): 53-70, 2003.
[13] Jia-Lien Hsu, Arbee L.P. Chen, and Hung-Chen Chen, “Finding Approximate Repeating Patterns from Sequence Data,” Proceedings of the International Symposium on Music Information Retrieval, 2004.
Description: Master's thesis; National Chengchi University; Department of Computer Science; 95753009; 97
Source: http://thesis.lib.nccu.edu.tw/record/#G0095753009
Type: thesis
Identifier: G0095753009
URI: https://nccur.lib.nccu.edu.tw/handle/140.119/32694
Table of contents:
Chapter 1 Introduction and Research Motivation 1
Chapter 2 Related Work 3
Chapter 3 Research Methods 8
Chapter 4 Analysis and Discussion of Experimental Results 36
Chapter 5 Conclusions and Future Directions 48
References 50
File formats: application/pdf (10 files: 100848, 127107, 146541, 148515, 148019, 178941, 602562, 214140, 131871, 120524 bytes)
Language: en_US