Title  移動式眼動儀之實作與視線軌跡分析
Mobile eye tracker construction and gaze path analysis
Author  王凱平
Wang, Kai Pin
Contributors  廖文宏 (advisor)
Liao, Wen Hung (advisor)
王凱平
Wang, Kai Pin
Keywords  eye tracker
gaze path
Date  2008
Upload time  17-Sep-2009 14:04:58 (UTC+8)
Abstract  The goal of this thesis is to build a low-cost eye tracker. The main hardware consists of a wireless infrared camera that captures infrared images of the eye and a visible-light camera that captures the scene; the wireless link allows the two cameras to be assembled into a head-mounted, mobile eye tracker. The tracking software is written with reference to the Starburst algorithm. Beyond constructing the device, the thesis uses the self-built tracker to implement three human-computer interface applications (eye scrolling, eye typing, and an eye-controlled game) as well as a cognitive-psychology experiment for locating areas of interest (AOIs).
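The feature-based stage of the Starburst approach can be sketched briefly. The following Python fragment is a minimal illustration under stated assumptions, not the thesis's actual implementation: it casts rays outward from a seed point inside the pupil, records the first strong dark-to-bright intensity jump along each ray as a candidate boundary point, and fits an ellipse to the candidates. The ray count, the thresholds, and the use of OpenCV's cv2.fitEllipse in place of the RANSAC-based ellipse optimization described in the thesis are all assumptions.

```python
# Minimal sketch of Starburst-style pupil boundary detection (illustrative only).
# Assumptions: a grayscale IR eye image, a rough seed point near the pupil center,
# and hand-picked thresholds; cv2.fitEllipse stands in for the thesis's RANSAC fit.
import numpy as np
import cv2

def detect_pupil_ellipse(gray, seed, n_rays=36, grad_thresh=20, max_len=120):
    h, w = gray.shape
    img = gray.astype(np.float32)
    candidates = []
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        dx, dy = np.cos(theta), np.sin(theta)
        prev = img[int(seed[1]), int(seed[0])]
        for r in range(1, max_len):
            x = int(round(seed[0] + r * dx))
            y = int(round(seed[1] + r * dy))
            if not (0 <= x < w and 0 <= y < h):
                break
            cur = img[y, x]
            # A sharp dark-to-bright transition marks a candidate pupil boundary point.
            if cur - prev > grad_thresh:
                candidates.append((x, y))
                break
            prev = cur
    if len(candidates) < 5:          # cv2.fitEllipse needs at least 5 points
        return None
    pts = np.array(candidates, dtype=np.float32)
    return cv2.fitEllipse(pts)       # ((cx, cy), (major, minor), angle)
```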

The thesis also proposes algorithms for analyzing gaze paths. For accuracy, the experimental data used in this analysis are mainly eye-movement recordings produced by an iView X Hi-Speed system. Two approaches are proposed for comparing gaze paths: a recursive intersection algorithm and a modified Dynamic Time Warping (DTW) algorithm. In addition, to find regions that gaze paths share over time, the thesis first converts the 2-D coordinate sequences into 1-D sequences according to a chosen partition size and then analyzes them with the Longest Common Subsequence (LCSS) method, with the aim of examining, from a computer-science perspective, the psychological information conveyed by gaze trajectories.
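The abstract names a modified Dynamic Time Warping algorithm for trajectory comparison but does not describe the modification. As a point of reference only, the sketch below computes the classic DTW distance between two gaze paths given as (x, y) sample sequences, with Euclidean point-to-point cost; the thesis's MDTW differs in details not given here.

```python
# Classic DTW distance between two gaze paths (reference only; the thesis's
# MDTW variant modifies this in ways not specified in the abstract).
import math

def dtw_distance(path_a, path_b):
    """path_a, path_b: lists of (x, y) gaze samples."""
    n, m = len(path_a), len(path_b)
    INF = float("inf")
    # dp[i][j] = cost of the best alignment of path_a[:i] with path_b[:j]
    dp = [[INF] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = math.dist(path_a[i - 1], path_b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # insertion
                                  dp[i][j - 1],      # deletion
                                  dp[i - 1][j - 1])  # match
    return dp[n][m]

# Example: two roughly similar scan paths
print(dtw_distance([(0, 0), (10, 5), (20, 10)],
                   [(0, 1), (9, 6), (15, 8), (21, 11)]))
```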
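The common-region analysis described above, quantizing 2-D gaze coordinates into grid cells of a chosen partition size and then comparing the resulting 1-D sequences with a longest-common-subsequence computation, can be illustrated as follows. The cell size, the image width, and the plain LCS dynamic program are illustrative assumptions rather than the thesis's exact procedure.

```python
# Sketch of the 2D-to-1D quantization plus LCSS comparison described in the
# abstract. Cell size and image width are illustrative values, and a standard
# LCS dynamic program stands in for the thesis's exact procedure.

def to_cell_sequence(points, cell=50, width=800):
    """Map (x, y) gaze samples to 1-D indices on a grid of cell-sized squares."""
    cols = width // cell
    return [(int(y) // cell) * cols + (int(x) // cell) for x, y in points]

def lcs_length(a, b):
    """Length of the longest common subsequence of two symbol sequences."""
    n, m = len(a), len(b)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m]

# Two viewers' gaze paths over the same stimulus
seq1 = to_cell_sequence([(120, 80), (130, 90), (400, 300), (650, 500)])
seq2 = to_cell_sequence([(125, 85), (390, 310), (660, 520), (700, 90)])
print(lcs_length(seq1, seq2))   # number of grid cells visited in the same order
```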
References  [1] SMI product page, http://www.smivision.com/
[2] G. Underwood, “Eye Guidance in Reading and Scene Perception,” Nottingham: Elsevier Science, 1998.
[3] D. Li, D. Winfield, and D. J. Parkhurst, “Starburst: A hybrid algorithm for video-based eye tracking combining feature-based and model-based approaches,” 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW'05), p. 79, 2005.
[4] M. Kumar, T. Winograd, and A. Paepcke, “Gaze-enhanced scrolling techniques,” CHI '07 Extended Abstracts on Human Factors in Computing Systems, pp. 2531-2536, 2007.
[5] D. Fono and R. Vertegaal, “EyeWindows: Evaluation of eye-controlled zooming windows for focus selection,” CHI '05 Extended Abstracts on Human Factors in Computing Systems, pp. 151-160, 2005.
[6] D. W. Hansen and J. P. Hansen, “Eye typing with common cameras,” in Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, p. 55, 2006.
[7] “Eye Controlled Assistive Technology,” http://www.tobii.com
[8] D. Hirschberg, “Algorithms for the longest common subsequence problem,” Journal of the ACM, vol. 24, pp. 664-675, 1977.
[9] “Eye tracking,” http://en.wikipedia.org/wiki/Eye_tracking
[10] R. J. K. Jacob, “Eye tracking in human-computer interaction and usability research: Ready to deliver the promises,” in The Mind's Eye: Cognitive and Applied Aspects of Eye Movements, Oxford: Elsevier Science, pp. 573-605, 2003.
[11] Z. Zhu and Q. Ji, “Eye and gaze tracking for interactive graphic display,” Machine Vision and Applications, vol. 15, no. 3, pp. 139-148, 2004.
[12] T. E. Hutchinson, K. P. White Jr., K. C. Reichert, and L. A. Frey, “Human-computer interaction using eye-gaze input,” IEEE Transactions on Systems, Man, and Cybernetics, pp. 1527-1533, 1989.
[13] Y. Ebisawa, M. Ohtani, and A. Sugioka, “Proposal of a zoom and focus control method using an ultrasonic distance-meter for video-based eye-gaze detection under free-hand condition,” in Proceedings of the 18th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 1996.
[14] C. H. Morimoto, D. Koons, A. Amir, and M. Flickner, “Frame-rate pupil detector and gaze tracker,” IEEE ICCV'99 Frame-Rate Workshop, 1999.
[15] LC Technologies Inc., http://www.eyegaze.com
[16] D. A. Forsyth and J. Ponce, “Computer Vision: A Modern Approach,” Prentice Hall, 2002.
[17] R. Hartley and A. Zisserman, “Multiple View Geometry in Computer Vision,” 2nd ed., Cambridge: Cambridge University Press, 2004.
[18] R. Vertegaal, “What do the eyes behold for human-computer interaction?,” in Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, pp. 59-60, 2002.
[19] G. Hotchkiss, “Chinese eye tracking study: Baidu vs. Google,” http://searchengineland.com/070615-081218.php
[20] C. S. Myers and L. R. Rabiner, “A comparative study of several dynamic time-warping algorithms for connected word recognition,” The Bell System Technical Journal, vol. 60, no. 7, pp. 1389-1409, Sept. 1981.
Description  Master's thesis
National Chengchi University
Department of Computer Science
95753018
97
Source  http://thesis.lib.nccu.edu.tw/record/#G0095753018
Type  thesis
Identifier  G0095753018
URI  https://nccur.lib.nccu.edu.tw/handle/140.119/32698
Table of contents  Chapter 1  Research Objectives
Chapter 2  Related Work
Chapter 3  Basic Principles of the Eye Tracker
3.1  Overview of the Starburst algorithm
3.2  Image preprocessing
3.3  Feature point detection
3.4  Ellipse model generation and optimization
3.5  Calibration procedure
3.6  Computing the projected gaze position
Chapter 4  Improvements and Validation of the Eye Tracker
4.1  Improving ellipse stability
4.2  Pupil detection by thresholding
4.3  Accuracy validation
Chapter 5  Applications in Human-Computer Interfaces and Psychological Research
5.1  HCI implementation 1: Eye Scrolling
5.2  HCI implementation 2: Eye Gaming
5.3  HCI implementation 3: Eye Typing
5.3.1  Eye typing system workflow
5.3.2  Eye typing results
5.4  Cognitive psychology experiments
5.4.1  Static scene images
5.4.2  Dynamic scene images
5.5  Finding AOIs
5.5.1  AOI analysis workflow
5.5.2  AOI detection results
Chapter 6  Gaze Path Analysis
6.1  Method 1: Recursive intersection
6.1.1  The recursive intersection algorithm and its properties
6.1.2  Gaze path comparison results with recursive intersection
6.2  Method 2: Modified Dynamic Time Warping (MDTW)
6.2.1  The MDTW algorithm and its properties
6.2.2  Gaze path comparison results with MDTW
6.3  2-Dimensional Longest Common Subsequence (2D LCSS)
6.3.1  The 2D LCSS algorithm and its properties
6.3.2  Common-region detection results with 2D LCSS
Chapter 7  Conclusions and Future Work
References
Appendix 1  Recursive intersection results
Appendix 2  MDTW results