Title 智慧型仿鏡互動顯示裝置
Magic Mirror: A Research on Smart Display Devices
Author Yeh, Chih-Wei (葉致偉)
Advisor Liao, Wen-Hung (廖文宏)
Keywords Human Computer Interface (人機介面)
Smart Display Device (智慧顯示裝置)
Smart Furniture (智慧家電)
Gesture User Interface (肢體人機介面)
Date 2005
Date Uploaded 17-Sep-2009 14:09:34 (UTC+8)
Abstract Gesture-based human-computer interfaces have long been regarded as the hallmark of future home interfaces. However, owing to the lack of suitable application environments and recognition technologies, related applications have yet to mature. This research proposes an interactive mirror-like display device as a platform for gesture commands, together with the associated recognition techniques, with the goal of designing an ergonomic interactive display device for smart home environments.
Gesture-based user interfaces have long been associated with the image of future technology. However, due to the lack of proper environments and recognition technologies, practical applications of intelligent user interfaces are still rare in modern life. In this research, we propose an interactive mirror which can be controlled by gesture commands. We also present several recognition techniques for this interactive display device. Practical applications are developed on this smart mirror, and a user test is conducted to evaluate this novel user interface.
References [1] E. Lee, T. Marrin Nakra and J. Borchers, "You're the conductor: A realistic interactive conducting system for children," in Proc. 2004 Int. Conf. on New Interfaces for Musical Expression, Hamamatsu, Japan, pp. 68-73, Jun. 2004.
[2] V. Henderson, S. Lee, H. Brashear, H. Hamilton, T. Starner and S. Hamilton, "Development of an American Sign Language game for deaf children," in Proc. 4th Int. Conf. on Interaction Design and Children, Boulder, CO, USA, pp. 70-79, Jun. 2005.
[3] L. Zhang, Y. Chen, G. Fang, X. Chen and W. Gao, "A vision-based sign language recognition system using tied-mixture density HMM," in Proc. 6th Int. Conf. on Multimodal Interfaces, State College, PA, USA, pp. 198-204, Oct. 2004.
[4] W. Freeman and C. Weissman, "Television control by hand gestures," in Proc. IEEE Int. Workshop on Automatic Face and Gesture Recognition, pp. 179-183, Jun. 1995.
[5] A. Wilson and N. Oliver, "GWindows: robust stereo vision for gesture-based control of windows," in Proc. 5th Int. Conf. on Multimodal Interfaces, Vancouver, British Columbia, Canada, pp. 211-218, Nov. 5-7, 2003.
[6] R. Vertegaal, "Attentive user interfaces: Introduction," Commun. ACM, vol. 46, pp. 30-33, 2003.
[7] http://www.research.philips.com/technologies/display/mrrordisp/mirrortv/
[8] http://www.seuratvmirror.com
[9] T. Darrell, G. Gordon, J. Woodfill and M. Harville, "A Virtual Mirror Interface Using Real-time Robust Face Tracking," in Proc. 3rd IEEE Int. Conf. on Automatic Face and Gesture Recognition, Nara, Japan, pp. 616-621, Apr. 14-16, 1998.
[10] W. Liao and T.Y. Li, "DD Theme Party: An Interactive Multimedia Showcase," in Proc. 2005 Int. Conf. on Digital Archive Technologies, Taipei, Taiwan, pp. 279-280, Jun. 16-17, 2005.
[11] J. Goto, K. Komine, Y. Kim and N. Uratani, "A Television Control System based on Spoken Natural Language Dialogue," in Proc. 9th IFIP TC13 Int. Conf. on Human-Computer Interaction, Zürich, Switzerland, pp. 765-768, Sep. 1-5, 2003.
[12] J.S. Shell, R. Vertegaal and A.W. Skaburskis, "EyePliances: attention-seeking devices that respond to visual attention," in Extended Abstracts of 2003 ACM SIGCHI Conf. on Human Factors in Computing Systems, Fort Lauderdale, FL, USA, pp. 770-771, Apr. 2003.
[13] R.J. Orr and G.D. Abowd, "The smart floor: a mechanism for natural user identification and tracking," in Extended Abstracts of 2000 ACM SIGCHI Conf. on Human Factors in Computing Systems, The Hague, Netherlands, pp. 275-276, Apr. 1-6, 2000.
[14] M. Yang, D.J. Kriegman and N. Ahuja, "Detecting faces in images: a survey," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 24, pp. 34-58, Jan. 2002.
[15] P. Viola and M.J. Jones, "Robust Real-Time Face Detection," Int. J. Computer Vision, vol. 57, pp. 137-154, 2004.
[16] S. Baluja, M. Sahami and H.A. Rowley, "Efficient Face Orientation Discrimination," in Proc. 2004 Int. Conf. on Image Processing, Singapore, vol. 1, pp. 589-592, Oct. 24-27, 2004.
[17] B.D. Zarit, B.J. Super and F.K.H. Quek, "Comparison of Five Color Models in Skin Pixel Classification," in Proc. Int. Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, Corfu, Greece, p. 58, Sep. 26-27, 1999.
[18] J. Yang, R. Stiefelhagen, U. Meier and A. Waibel, "Visual tracking for multimodal human computer interaction," in Proc. 1998 ACM SIGCHI Conf. on Human Factors in Computing Systems, Los Angeles, CA, USA, pp. 140-147, Apr. 18-23, 1998.
[19] Y. Cheng, "Mean Shift, Mode Seeking, and Clustering," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 17, pp. 790-799, 1995.
[20] G.R. Bradski, "Real Time Face and Object Tracking as a Component of a Perceptual User Interface," in Proc. 4th IEEE Workshop on Applications of Computer Vision, p. 214, Oct. 1998.
[21] J.L. Barron, D.J. Fleet and S.S. Beauchemin, "Performance of optical flow techniques," Int. J. Computer Vision, vol. 12, pp. 43-77, 1994.
[22] F. Bourel, C.C. Chibelushi and A.A. Low, "Robust Facial Feature Tracking," in Proc. 11th British Machine Vision Conference, Bristol, UK, vol. 1, pp. 232-241, Sep. 2000.
[23] B.D. Lucas and T. Kanade, "An iterative image registration technique with an application to stereo vision," in Proc. 1981 Int. Joint Conf. on Artificial Intelligence, pp. 674-679, 1981.
[24] J. Shi and C. Tomasi, "Good features to track," in Proc. 1994 IEEE Conf. on Computer Vision and Pattern Recognition, Seattle, Washington, USA, pp. 593-600, Jun. 20-24, 1994.
[25] J. Letessier and F. Bérard, "Visual Tracking of Bare Fingers for Interactive Surfaces," in Proc. 17th Annual ACM Symposium on User Interface Software and Technology, Santa Fe, New Mexico, USA, pp. 119-122, Oct. 24-27, 2004.
[26] S. Kawato and J. Ohya, "Real-time Detection of Nodding and Head-shaking by Directly Detecting and Tracking the 'Between-Eyes'," in Proc. 4th Int. Conf. on Automatic Face and Gesture Recognition, Grenoble, France, pp. 40-45, Mar. 28-30, 2000.
[27] A. Kapoor and R.W. Picard, "A Real-Time Head Nod and Shake Detector," in Proc. 2001 Workshop on Perceptive User Interfaces, Orlando, Florida, USA, pp. 1-5, Nov. 15-16, 2001.
Description Master's thesis
National Chengchi University
Department of Computer Science
93753019
94
Source http://thesis.lib.nccu.edu.tw/record/#G0937530191
Type thesis
URI https://nccur.lib.nccu.edu.tw/handle/140.119/32732
Table of Contents CHAPTER 1 INTRODUCTION 1
1.1 The Magic Mirror 2
1.2 Related Work 5
1.2.1 Mirror TV 6
1.2.2 Home Appliances Controlled by Intelligent User Interfaces 7
1.3 Overview 9
CHAPTER 2 FACE DETECTION AND TRACKING 10
2.1 Face Detection 10
2.1.1 AdaBoosting Face Detector 11
2.1.2 Skin-Color Face Filter 13
2.2 Face Tracking 16
2.2.1 Mean Shift Tracker 16
2.2.2 Local Feature Tracker 18
2.3 Performance Evaluation 20
CHAPTER 3 GESTURE RECOGNITION 25
3.1 Gesture Commands 25
3.2 Gesture Pointing 27
3.3 Gesture Selection 31
3.3.1 Nodding and Head-shaking Detection 31
CHAPTER 4 INFORMATION FUSION AND SPECIAL EFFECTS 33
4.1 Information Fusion 33
4.1.1 Message of The Day 33
4.1.2 News Clip Playback 34
4.2 Special Effects 35
4.2.1 Wear-a-Tie 35
4.2.2 Facial Complexion Enhancement 36
CHAPTER 5 DISPLAY MANAGER 39
5.1 Display Guidelines 39
5.2 Attentive Levels 41
CHAPTER 6 USER TEST 43
6.1 The Response Time Test 43
CHAPTER 7 CONCLUSION AND FUTURE WORK 46
REFERENCES 48