Please use this identifier to cite or link to this item: https://ah.nccu.edu.tw/handle/140.119/126581


Title: 3D互動敘事中以穿戴式裝置與虛擬角色互動之機制設計
Using Wearable Devices to Interact with Virtual Agents in 3D Interactive Storytelling
Authors: 王玟璇
Wang, Wen-Hsuan
Contributors: 李蔡彥
Li, Tsai-Yen
王玟璇
Wang, Wen-Hsuan
Keywords: Interactive Storytelling
Virtual Reality
Wearable Devices
Computer Animation
Date: 2019
Issue Date: 2019-10-03 17:17:57 (UTC+8)
Abstract: In recent years, more and more industries have adopted virtual reality technology in applications such as workplace training simulation and game entertainment. In most of these applications, however, users interact with objects and NPCs in the environment only through controller buttons or predefined action options; after the player chooses an action, the NPC's response is usually a fixed canned animation or a simple voice or text output.
We believe this kind of interaction does not let players truly immerse themselves in the virtual world. We therefore propose using wearable motion-capture devices so that players can use their natural movements as input, and parameterizing the virtual characters' animation modules so that the same modules can produce more diverse output as the parameters vary. In this way, the same characters and scenes can develop different plots and animation feedback depending on how the player interacts with them.
We implemented a system in which the player performs body movements captured by wearable devices; the system interprets the movements, decides how to render the player character's animation, and determines whether any NPC interaction events are triggered, steering the story toward different endings according to the interaction. In the experiment, we compared two input media, wearable devices and VIVE controllers; participants filled out questionnaires and were interviewed after the experience. Analysis of the results verified that our interaction design is intuitive and fluent, and that participants wanted to try different story paths, demonstrating the replay value of our system.
In recent years, more and more industries and companies have devoted themselves to the development of virtual reality applications such as work training and entertainment. However, most of them use traditional user interfaces such as buttons or predefined action sequences to interact with virtual agents. When a player has chosen her movement, the responses from NPCs are usually fixed animations, voice, or text outputs.
We think this kind of interaction does not allow players to immerse themselves in a virtual world easily. Instead, we suggest using wearable devices to capture the player's gestures and use her natural movements as inputs. In addition, we attempt to make the animation module of the virtual character parameterizable in order to deliver appropriate, flexible, and diversified responses. We hope that players can experience different story plots and perceive responsive animation feedback when they interact with the virtual world.
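The idea of a parameterizable animation module can be illustrated with a minimal sketch. This is not the thesis implementation; the function, parameters, and output fields below are all illustrative assumptions, showing only how continuous parameters can diversify what would otherwise be a single canned response clip:

```python
# Hypothetical sketch of a parameterized animation response: a base clip
# plus continuous parameters, instead of one fixed canned animation.
# All names and value ranges are illustrative assumptions.

def parameterized_response(base_clip, intensity, friendliness):
    """Derive playback settings for a response clip from two parameters."""
    # Clamp parameters to the unit interval.
    intensity = max(0.0, min(1.0, intensity))
    friendliness = max(0.0, min(1.0, friendliness))
    return {
        "clip": base_clip,
        "playback_speed": 0.75 + 0.5 * intensity,    # faster when intense
        "gesture_amplitude": 0.5 + 0.5 * intensity,  # larger motion when intense
        "facial_blend": "smile" if friendliness >= 0.5 else "neutral",
    }
```

With such a mapping, the same base clip yields visibly different output for different interaction contexts, which is the diversification the abstract describes.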
We have implemented an interactive storytelling system that captures and interprets the user's body actions through wearable devices. The system decides how to perform the player character's animation accordingly, and the storyline is adjusted when NPC interactions are activated, leading to different story experiences. We conducted a user study comparing a traditional controller with the wearable device. Participants evaluated the system by filling out questionnaires and were interviewed after the experiment. The experimental results reveal that the interaction methods we designed are intuitive and easy to use compared with the controller. In addition, users were willing to play with the system multiple times, which confirms the replay value of our interactive storytelling system.
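The control loop described above, in which a body action is interpreted, may trigger an NPC interaction event, and steers the story toward different endings, can be sketched as a small state machine. This is a simplified illustration, not the actual system; the gestures, events, and story states are hypothetical:

```python
# Minimal sketch (assumed names throughout) of the abstract's loop:
# classify a body action, map it to an NPC interaction event, and
# advance a branching storyline accordingly.

GESTURE_EVENTS = {"wave": "greet_npc", "point": "ask_direction"}

# (current story state, triggered event) -> next story state
STORY_GRAPH = {
    ("start", "greet_npc"): "npc_friendly",
    ("start", "ask_direction"): "npc_guides",
    ("npc_friendly", "ask_direction"): "good_ending",
}

def step(state, gesture):
    """Interpret one gesture; return (new story state, triggered event)."""
    event = GESTURE_EVENTS.get(gesture)  # interpret the body action
    if event is None:
        return state, None               # no NPC interaction triggered
    return STORY_GRAPH.get((state, event), state), event
```

Because different gesture sequences traverse different paths through the graph, the same scene can end differently across playthroughs, which is the source of the replay value discussed in the evaluation.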
Description: Master's thesis
National Chengchi University
Department of Computer Science
105753004
Source URI: http://thesis.lib.nccu.edu.tw/record/#G0105753004
Data Type: thesis
Appears in Collections: [Department of Computer Science] Theses

Files in This Item:

File: 300401.pdf (3179 Kb, Adobe PDF)


All items in 學術集成 are protected by copyright, with all rights reserved.

