Academic Output - Degree Thesis

Title 利用生理感測資料之線上情緒辨識系統
On-line Emotion Recognition System by Physiological Signals
Author 陳建家
Chen, Jian Jia
Contributors 蔡子傑 (advisor)
Tsai, Tzu Chieh
陳建家
Chen, Jian Jia
Keywords emotion
physiological sensors
algorithm
on-line recognition
smart environment
Date 2008
Upload time 17-Sep-2009 14:05:43 (UTC+8)
Abstract A smart living environment should provide thoughtful services suited to the user's emotional state. The goal of our research is to develop an emotion recognition system that detects internal emotional states from variations in externally measurable physiological data.
First, we applied the dimensional analysis approach and adopted the IAPS (International Affective Picture System) to conduct controlled psychological experiments, collecting physiological data and subjective ratings of arousal and valence from 20 subjects. We propose an emotion recognition learning algorithm that extracts a pattern for each emotion through cross-validation training and then adapts those patterns to the individual by incorporating real-time test data. Evaluating each subject's learning trend shows a clear improvement in recognition rate. Furthermore, we applied a dimensional-to-discrete emotion transformation to validate the subjects' ratings. Compared with the results of related work, our system achieves higher recognition rates for arousal and valence in the dimensional analysis and a clear separation of classes in the discrete analysis.
Most importantly, the system is implemented on wireless physiological sensors, which makes it mobile; it can reflect emotional states in real time to provide on-line smart services.
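The abstract above outlines a learning loop: train a pattern per emotion by cross-validation, then rebuild the model as personalized test data arrives. A minimal pure-Python sketch of that loop, using a toy nearest-centroid "pattern" in place of the thesis's actual SVM / Fisher LDA models, with made-up feature vectors; all names and data here are illustrative, not the thesis's:

```python
import math
from statistics import mean

# Hypothetical sketch of the train / cross-validate / adapt loop described in
# the abstract. A toy nearest-centroid "pattern" per emotion class stands in
# for the SVM and Fisher LDA machinery the thesis itself uses.

def centroid(vectors):
    """Component-wise mean of a list of feature vectors."""
    return tuple(mean(component) for component in zip(*vectors))

def distance(a, b):
    return math.dist(a, b)

def train(samples):
    """samples: {emotion_label: [feature_vector, ...]} -> {label: centroid}."""
    return {label: centroid(vecs) for label, vecs in samples.items()}

def classify(model, vec):
    """Assign vec to the emotion whose pattern (centroid) is nearest."""
    return min(model, key=lambda label: distance(model[label], vec))

def loo_accuracy(samples):
    """Leave-one-out cross-validation, echoing the 'Cross Validation' stage."""
    hits = total = 0
    for label, vecs in samples.items():
        for i, held_out in enumerate(vecs):
            reduced = {l: [v for j, v in enumerate(vs) if (l, j) != (label, i)]
                       for l, vs in samples.items()}
            hits += classify(train(reduced), held_out) == label
            total += 1
    return hits / total

def adapt(samples, label, vec):
    """'Model Rebuild' step: fold a newly recognized instance back into the
    training set and retrain, personalizing the patterns over time."""
    samples[label].append(vec)
    return train(samples)
```

For example, with two toy classes such as `{"high_arousal": [(0.9, 0.8), (0.8, 0.9)], "low_arousal": [(0.1, 0.2), (0.2, 0.1)]}`, `train` followed by `classify` recovers the expected class, and `adapt` shifts each centroid toward the incoming user's data.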
References
[1] J. LeDoux, The Emotional Brain. New York: Simon & Schuster, 1996.
[2] P. Salovey and J. D. Mayer, “Emotional intelligence,” Imagination, Cognition and Personality, vol. 9, no. 3, pp. 185-211, 1990.
[3] D. Goleman, Emotional Intelligence. New York: Bantam Books, 1995.
[4] K. R. Scherer, “Speech and emotional states,” ch. 10 in J. K. Darby, ed., Speech Evaluation in Psychiatry, pp. 189-220. Grune and Stratton, 1981.
[5] Y. Yacoob and L. S. Davis, “Recognizing human facial expressions from long image sequences using optical flow,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 18, no. 6, pp. 636-642, June 1996.
[6] I. A. Essa and A. Pentland, “Facial expression recognition using a dynamic model and motion energy,” in Proceedings of the International Conference on Computer Vision, pp. 360-367, Cambridge, MA, 1995. IEEE Computer Society.
[7] R. W. Picard. Affective Computing. The MIT Press, Cambridge, MA, 1997.
[8] C. M. Jones and T. Troen, “Biometric valence and arousal recognition,” in Proceedings of the 2007 Conference of the Computer-Human Interaction Special Interest Group (CHISIG) of Australia on Computer-Human Interaction, ACM International Conference Proceeding Series, vol. 251, 2007.
[9] C. Cortes and V. Vapnik, “Support-vector networks,” Machine Learning, vol. 20, no. 3, pp. 273-297, 1995.
[10] Peng, H.C., Long, F., and Ding, C., "Feature selection based on mutual information: criteria of max-dependency, max-relevance, and min-redundancy," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 27, No. 8, pp.1226-1238, 2005.
[11] R.O. Duda and P.E. Hart. Pattern Classification and Scene Analysis. Wiley-Interscience, 1978.
[12] K. H. Kim, S. W. Bang, and S. R. Kim, “Emotion recognition system using short-term monitoring of physiological signals,” Medical & Biological Engineering & Computing, vol. 42, 2004.
[13] P. J. Lang, M. M. Bradley, and B. N. Cuthbert, International Affective Picture System (IAPS): Technical Manual and Affective Ratings. Center for Research in Psychophysiology, University of Florida, 1999.
[14] R. Kohavi, “A study of cross-validation and bootstrap for accuracy estimation and model selection,” in Proceedings of the Fourteenth International Joint Conference on Artificial Intelligence (IJCAI), vol. 2, pp. 1137-1143. Morgan Kaufmann, San Mateo, 1995.
[15] J. Chang, Y. Luo, and K. Su, “GPSM: a generalized probabilistic semantic model for ambiguity resolution,” in Proceedings of the 30th Annual Meeting of the Association for Computational Linguistics, Newark, Delaware, June 1992, pp. 177-184.
[16] Devijver, P. A., and J. Kittler, Pattern Recognition: A Statistical Approach, Prentice-Hall, London, 1982.
[17] 徐世平, “Application of Physiological Signal Monitoring in Smart Living Space,” Master's thesis, June 2008.
Description Master's thesis
National Chengchi University
Department of Computer Science
95753038
97
Source http://thesis.lib.nccu.edu.tw/record/#G0095753038
Type thesis
URI https://nccur.lib.nccu.edu.tw/handle/140.119/32703
Table of Contents
CHAPTER 1 Introduction
1.1. Background
1.1.1. Emotion
1.1.2. Physiological Signals
1.1.3. Physiological Pattern
1.2. Motivation
1.3. Organization
CHAPTER 2 Related Work
2.1. Pattern Recognition
2.1.1. Feature Selection
2.1.2. Fisher Linear Discriminant Analysis
2.1.3. Support Vector Machine
2.2. Emotion Induction
CHAPTER 3 On-line Emotion Recognition Algorithm
3.1. Training Phase
3.1.1. Physiological Data
3.1.2. Feature Extraction
3.1.3. Cross Validation
3.1.4. Remove Ambiguous Data
3.1.5. Threshold
3.2. Testing Phase
3.2.1. Training Model
3.2.2. Model Rebuild
3.2.3. Learning Trend (k)
3.3. Dimensional Emotion to Discrete Emotion
3.3.1. International Affective Picture System (IAPS)
3.3.2. Probability Density Function
CHAPTER 4 Recognition System Implementation
4.1. System Structure
4.1.1. Training Instances
4.1.2. The Application Program Interface (API)
4.2. Recognizing the Incoming Instances
CHAPTER 5 Experimental Evaluation
5.1. Experimental Setup
5.2. Experiment Results
5.2.1. Results of Recognition Rate and Learning Rate on Valence
5.2.2. Results of Recognition Rate and Learning Rate on Arousal
5.2.3. Result of Dimensional Emotion to Discrete Emotion
CHAPTER 6 Conclusions
References

LIST OF TABLES

Table 3-1: Mean and standard deviation of valence for discrete emotion states from IAPS.
Table 3-2: Mean and standard deviation of valence and arousal from IAPS.
Table 5-1: Experimental setup.
Table 5-2: Comparison of recognition across valence levels.
Table 5-3: Comparison of recognition across arousal levels.
Table 5-4: Statistical parameters of arousal and valence.

LIST OF FIGURES

Figure 1-1: Physiological signal devices.
Figure 1-2: ECG (electrocardiogram).
Figure 1-3: BVP (blood volume pulse).
Figure 1-4: EMG (electromyogram).
Figure 1-5: SC or SCL (skin conductivity).
Figure 1-6: RESP (respiration).
Figure 2-1: Feature data set.
Figure 3-1: Proposed algorithm.
Figure 3-2: Physiological sensors: (a) respiration (RESP), (b) MULTI (BVP, SCL, TEMP), (c) EMG (EMG1, EMG2).
Figure 3-3: The training phase.
Figure 3-4: Removing the instance farthest from the center of gravity.
Figure 3-5: The default model modification.
Figure 3-6: The default model modification in detail.
Figure 3-7: An example of k.
Figure 3-8: Emotion model.
Figure 3-9: IAPS average rating distribution.
Figure 3-10: Discrete emotion state distribution.
Figure 3-11: The probability distribution of surprise for arousal and valence.
Figure 3-12: The probability distribution of sadness for arousal and valence.
Figure 4-1: System implementation.
Figure 4-2: Application flow.
Figure 4-3: System implementation blocks.
Figure 5-1: IAPS database.
Figure 5-2: Subjective rating table: (a) class, (b) valence, (c) arousal.
Figure 5-3: System server GUI.
Figure 5-4: Objective learning rate of valence.
Figure 5-5: Subjective learning rate of valence.
Figure 5-6: Comparison of learning rates of valence.
Figure 5-7: Recognition of each valence level.
Figure 5-8: Subjective learning rate of arousal.
Figure 5-9: Objective learning rate of arousal.
Figure 5-10: Comparison of learning rates of arousal.
Figure 5-11: Recognition rate of each arousal level.
Figure 5-12: Dimensional to discrete emotion.
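Section 3.3 of the contents (with Table 3-2 and Figures 3-10 to 3-12) describes converting dimensional valence/arousal ratings into discrete emotions via probability density functions. A minimal sketch of one such mapping, with made-up per-emotion means and standard deviations (the thesis's actual Table 3-2 values are not reproduced here), treating the two axes as independent Gaussians:

```python
import math

# Hypothetical per-emotion statistics on the 9-point IAPS valence/arousal
# scales; invented for illustration, not the thesis's Table 3-2 values.
EMOTION_STATS = {
    "joy":      {"valence": (7.0, 1.0), "arousal": (5.5, 1.2)},
    "sadness":  {"valence": (2.5, 1.1), "arousal": (4.5, 1.3)},
    "surprise": {"valence": (5.5, 1.4), "arousal": (6.5, 1.0)},
}

def gaussian_pdf(x, mu, sigma):
    """Probability density of N(mu, sigma**2) at x."""
    return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

def discrete_emotion(valence, arousal):
    """Map a dimensional (valence, arousal) rating to the discrete emotion
    under which the rating is most likely."""
    def likelihood(name):
        v_mu, v_sd = EMOTION_STATS[name]["valence"]
        a_mu, a_sd = EMOTION_STATS[name]["arousal"]
        return gaussian_pdf(valence, v_mu, v_sd) * gaussian_pdf(arousal, a_mu, a_sd)
    return max(EMOTION_STATS, key=likelihood)
```

Under these made-up statistics, a high-valence, mid-arousal rating such as (7.2, 5.2) falls to "joy", while a low-valence rating such as (2.0, 4.0) falls to "sadness"; with the thesis's real statistics the boundaries would of course differ.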