Academic Output - Thesis

Title 以視線軌跡為基礎的人機介面
Gaze-Based Human-Computer Interaction
Author 余立強 (Yu, Li Chiang)
Advisor 廖文宏 (Liao, Wen Hung)
Keywords Eye Tracking
HCI
Pupil Detection
Date 2008
Uploaded 19-Sep-2009 12:11:37 (UTC+8)
Abstract The eye tracker, a device for measuring eye position and movement, has traditionally been used in research on the human visual system, psychology, and interface design, and has also served as an input device for people with physical disabilities who can still move their eyes. With recent progress in hardware and imaging technology, it has the potential to complement, or even replace, popular input devices such as the mouse and keyboard. The objective of this research is to design and implement low-cost head-mounted and remote eye trackers and to develop applications built on gaze-based interaction. Because tracking is disturbed by image noise and corneal reflections, we improve localization accuracy with a new pupil detection algorithm that exploits the high proportion of dark pixels around the pupil, and we compensate for the projection error that head movement introduces. Using these trackers, we implement several gaze-based user interfaces, including an eye-controlled web browser, an attention-based photo browser that enhances the photo region the user fixates, an interactive tic-tac-toe game, and interactive media, and we investigate the feasibility of using the eye trackers to record and evaluate how users view a mobile phone interface.
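The abstract's pupil-localization idea (the neighborhood of the true pupil center contains a high proportion of dark pixels, whereas noise pixels and corneal-reflection artifacts are isolated) can be sketched as follows. All names, window sizes, and thresholds here are illustrative assumptions, not the thesis's implementation:

```python
# Hypothetical sketch of the dark-pixel-ratio heuristic: candidate pupil
# centers are scored by the fraction of dark pixels in a small window,
# since the region around the true pupil center is almost entirely dark.

def dark_ratio(image, cx, cy, radius=2, threshold=60):
    """Fraction of pixels darker than `threshold` in a square window around (cx, cy)."""
    h, w = len(image), len(image[0])
    dark = total = 0
    for y in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(w, cx + radius + 1)):
            total += 1
            if image[y][x] < threshold:
                dark += 1
    return dark / total

def find_pupil_center(image, threshold=60):
    """Return the dark pixel whose neighborhood has the highest dark-pixel ratio."""
    best, best_score = None, -1.0
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value < threshold:            # consider only dark candidate pixels
                score = dark_ratio(image, x, y)
                if score > best_score:
                    best, best_score = (x, y), score
    return best

# Synthetic 20x20 grayscale frame: bright background (200), a dark 5x5 "pupil"
# centered at (10, 8), and an isolated dark speck at (2, 2) standing in for a
# corneal-reflection artifact.
frame = [[200] * 20 for _ in range(20)]
for y in range(6, 11):
    for x in range(8, 13):
        frame[y][x] = 20
frame[2][2] = 20

print(find_pupil_center(frame))  # → (10, 8)
```

On this synthetic frame the isolated speck scores only 1/25, so the detector settles on the pupil center; a real implementation would run on infrared camera frames and feed the detected points to an ellipse fit, as outlined in Chapter 3 of the thesis.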
References [1] Emiliano Castellina, Faisal Razzak, Fulvio Corno, “Environmental Control Application Compliant with COGAIN Guidelines,” The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[2] Jacob O. Wobbrock, James Rubinstein, Michael Sawyer, Andrew T. Duchowski, “Not Typing but Writing: Eye-based Text Entry Using Letter-like Gestures,” The 3rd Conference on Communication by Gaze Interaction (COGAIN 2007), 2007.
[3] Howell Istance, Aulikki Hyrskykari, Stephen Vickers, Nazmie Ali, “User Performance of Gaze-Based Interaction with On-line Virtual Communities,” The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[4] Stephen Vickers, Howell Istance, Aulikki Hyrskykari, “Selecting Commands in 3D Game Environments by Gaze Gestures,” The 5th Conference on Communication by Gaze Interaction (COGAIN 2009), 2009.
[5] 王凱平, Implementation of a Mobile Eye Tracker and Gaze Trajectory Analysis (in Chinese), Master's thesis, Department of Computer Science, National Chengchi University, 2008.
[6] Detlev Droege, Carola Schmidt, Dietrich Paulus, “A Comparison of Pupil Centre Estimation,” The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[7] Flavio Luiz Coutinho, Carlos Hitoshi Morimoto, “Free Head Motion Eye Gaze Tracking Using a Single Camera and Multiple Light Sources,” XIX Brazilian Symposium on Computer Graphics and Image Processing (SIBGRAPI '06), pp. 171-178, 2006.
[8] Craig Hennessey, Borna Noureddin, Peter Lawrence, “A Single Camera Eye-Gaze Tracking System with Free Head Motion,” Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[9] Dongheng Li, David Winfield, Derrick J. Parkhurst, “Starburst: A Hybrid Algorithm for Video-Based Eye Tracking Combining Feature-Based and Model-Based Approaches,” 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW '05), p. 79, 2005.
[10] Gintautas Daunys, Nerijus Ramanauskas, “The Accuracy of Eye Tracking Using Image Processing,” Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 377-380, 2004.
[11] Dongshi Xia, Zongcai Ruan, “IR Image Based Eye Gaze Estimation,” Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007), vol. 1, pp. 220-224, 2007.
[12] Sepehr Attarchi, Karim Faez, Amin Asghari, “A Fast and Accurate Iris Recognition Method Using the Complex Inversion Map and 2DPCA,” Seventh IEEE/ACIS International Conference on Computer and Information Science (ICIS 2008), pp. 179-184, 2008.
[13] Craig Hennessey, “Eye-Gaze Tracking With Free Head Motion,” Master of Applied Science thesis, University of British Columbia, 2005.
[14] Somnath Dey, Debasis Samanta, “An Efficient Approach for Pupil Detection in Iris Images,” 15th International Conference on Advanced Computing and Communications (ADCOM 2007), pp. 382-389, 2007.
[15] Christophe Cudel, Sacha Bernet, Michel Basset, “Fast and Easy Calibration for a Head-Mounted Eye Tracker,” The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[16] Edward Cutrell, Zhiwei Guan, “An Eye Tracking Study of the Effect of Target Rank on Web Search,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2007.
[17] Petr Novák, Tomáš Krajník, Libor Přeučil, Marcela Fejtová, Olga Štěpánková, “AI Support for a Gaze Controlled Wheelchair,” The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[18] Matt Feusner, Brian Lukoff, “Testing for Statistically Significant Differences between Groups of Scan Patterns,” Proceedings of the 2008 Symposium on Eye Tracking Research & Applications (ETRA '08), pp. 43-46, 2008.
[19] Nguyen Van Huan, Hakil Kim, “A Novel Circle Detection Method for Iris Segmentation,” 2008 Congress on Image and Signal Processing (CISP 2008), vol. 3, pp. 620-624, 2008.
[20] Craig Hennessey, Borna Noureddin, Peter Lawrence, “A Single Camera Eye-Gaze Tracking System with Free Head Motion,” Proceedings of the 2006 Symposium on Eye Tracking Research & Applications (ETRA '06), pp. 87-94, 2006.
[21] Zhiwei Zhu, Qiang Ji, “Eye Gaze Tracking Under Natural Head Movements,” Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR '05), 2005.
[22] Teodora Vatahska, Maren Bennewitz, Sven Behnke, “Feature-Based Head Pose Estimation from Images,” Proceedings of the IEEE-RAS 7th International Conference on Humanoid Robots (Humanoids), 2007.
[23] Henrik H.T. Skovsgaard, John Paulin Hansen, Julio C. Mateo, “How Can Tiny Buttons Be Hit Using Gaze Only?” The 4th Conference on Communication by Gaze Interaction (COGAIN 2008), 2008.
[24] Laura Cowen, “An Eye Movement Analysis of Web-Page Usability,” Unpublished Master's thesis, Lancaster University, UK, 2001.
[25] Robert W. Reeder, Peter Pirolli, Stuart K. Card, “WebEyeMapper and WebLogger: Tools for Analyzing Eye Tracking Data Collected in Web-Use Studies,” CHI '01 Extended Abstracts on Human Factors in Computing Systems, pp. 19-20, 2001.
Description Master's thesis
National Chengchi University
Department of Computer Science
96753025
97
Source http://thesis.lib.nccu.edu.tw/record/#G0096753025
Type thesis
URI https://nccur.lib.nccu.edu.tw/handle/140.119/37119
Table of Contents Chapter 1 Introduction
1.1 Research Background
1.2 Research Objectives
Chapter 2 Related Work
2.1 Construction of the Eye Tracker
2.1.1 Locating the Eye Center
2.1.2 Calibration Procedure
2.1.3 Mapping the Eye Center to Screen Coordinates
2.2 Eye-Tracker-Based Human-Computer Interfaces
Chapter 3 Pupil Detection Algorithm
3.1 Pupil Detection Algorithm
3.1.1 Detecting Feature Points
3.1.2 Fitting the Best Ellipse to the Feature Points
3.1.3 Removing Noise Points
3.1.4 Projecting Pupil Coordinates onto the Screen
3.1.5 Experimental Apparatus
3.1.6 Accuracy Analysis
3.1.7 Cases Where Accurate Localization Fails
3.2 Pupil Detection for the Remote Eye Tracker
3.2.1 Marking the Eye Region
3.2.2 Finding Pupil Feature Points
3.2.3 Removing Noise Points
3.2.4 Projecting Pupil Coordinates onto the Screen
3.2.5 Experimental Apparatus
3.2.6 Accuracy Analysis
Chapter 4 Handling Slight Head Movement
4.1 Slight Head Movement Handling for the Head-Mounted Eye Tracker
4.1.1 Locating the Marker
4.1.2 Processing Head Movement Information Before and After Calibration
4.1.3 Accuracy Analysis After Head Movement Compensation
4.1.4 Sources of Inaccuracy and Remaining Issues
4.2 Slight Head Movement Handling for the Remote Eye Tracker
4.2.1 Processing Head Rotation Information Before and After Calibration
4.2.2 Accuracy Analysis After Head Movement Compensation
4.2.3 Sources of Inaccuracy and Remaining Issues
Chapter 5 Gaze-Based Human-Computer Interfaces
5.1 Gaze-Based Web Browser
5.1.1 Receiving Gaze Coordinates from the Eye Tracker
5.1.2 Partitioning Web Page Content into Blocks
5.1.3 Highlighting the Fixated Block, Zooming In, and Left-Clicking
5.1.4 Scrolling According to the User's Gaze Position
5.2 Photo Browser That Records Viewing Behavior
5.2.1 Interface of the Photo Browser
5.2.2 Workflow and Implementation for Enhancing the Fixated Photo Region
5.2.3 Possible Applications
5.3 Recording How Users View a Mobile Phone Interface
5.3.1 Differences Between Tracking on a Phone Screen and a Regular Screen
5.3.2 Implementation
5.3.3 Calibrating the Eye Tracker on Dynamic Video
5.3.4 Recording Gaze Positions on the Phone
5.3.5 Follow-up Analysis
5.4 Tic-Tac-Toe Game
5.5 Interactive Media
Chapter 6 Conclusion
References
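The calibration and screen-projection steps listed in the table of contents (sections 2.1.2, 3.1.4, 3.2.4) map detected pupil coordinates to screen coordinates. As an illustrative baseline only, since this record does not give the thesis's exact mapping, a common approach fits an affine map from pupil to screen coordinates by least squares over the calibration targets:

```python
# Illustrative gaze-calibration sketch: fit screen = a*px + b*py + c for each
# screen axis via least squares over calibration points. All point values and
# names are hypothetical; this is a baseline, not the thesis's exact method.

def solve_linear(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_affine(pupil_pts, screen_coords):
    """Least-squares coefficients (a, b, c) via the normal equations."""
    rows = [[px, py, 1.0] for px, py in pupil_pts]
    AtA = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    Atb = [sum(r[i] * s for r, s in zip(rows, screen_coords)) for i in range(3)]
    return solve_linear(AtA, Atb)

def calibrate(pupil_pts, screen_pts):
    """Return a function mapping a pupil coordinate to a screen coordinate."""
    cx = fit_affine(pupil_pts, [s[0] for s in screen_pts])
    cy = fit_affine(pupil_pts, [s[1] for s in screen_pts])
    def gaze_to_screen(px, py):
        return (cx[0] * px + cx[1] * py + cx[2],
                cy[0] * px + cy[1] * py + cy[2])
    return gaze_to_screen

# Hypothetical 4-point calibration: pupil positions recorded while the user
# fixates the four corners of a 1280x1024 screen.
pupil = [(100, 80), (160, 80), (100, 120), (160, 120)]
screen = [(0, 0), (1280, 0), (0, 1024), (1280, 1024)]
to_screen = calibrate(pupil, screen)
print(to_screen(130, 100))  # pupil midway between targets → approximately (640.0, 512.0)
```

With more calibration points the same least-squares fit absorbs measurement noise, and a head-movement compensation step, as in Chapter 4 of the thesis, would adjust the pupil coordinates before this mapping is applied.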