Publications-Theses

Title 分析音樂特徵尋找將情緒引導至正向之音樂
Finding Music for Leading Personal Emotions to Positive Valence by Analyzing Music Features
Author 史訓綱
Shih, Hsum Kang
Contributors 陳良弼 (advisor)
史訓綱
Shih, Hsum Kang
Keywords Music
Emotion
Date 2006
Date uploaded 17-Sep-2009 14:03:07 (UTC+8)
Abstract Many studies have indicated that music can guide human emotions. Music therapy theory holds that music's psychological effect lies mainly in guiding changes of emotion. Most previous studies and experiments on musical emotion and music features, however, focused on automatically identifying the emotion expressed by the music itself and on how that emotion relates to the music's features. The goal of this thesis is to find, by analyzing music features, music that guides a person's emotion toward positive valence, and to study the characteristics of such music and the relationships among these pieces.
To search for such emotion-guiding music, we first define music features that simplify and represent a piece, in such a way that pieces inducing the same emotional change also turn out to be highly similar under this representation. We therefore run listening experiments to determine how each piece affects listeners' emotions, and then analyze the pieces that guide emotions in different directions. We find that when the music features are treated as ordered sequences, searching for music that induces similar emotional changes becomes more accurate.
Consequently, using the songs whose emotional effects are known from the experiments as a baseline, we can estimate what kind of emotional change other, unseen music is likely to induce. This could be applied in music therapy, giving music therapists an alternative way of selecting music.
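To make the sequential representation and matching concrete, the following is a minimal Python sketch under stated assumptions, not the thesis's actual implementation: it assumes each song has already been reduced to an ordered sequence of quantized feature symbols (e.g., volume-change, pitch-change, harmony, or chord labels per segment, as named in Chapter 3), scores similarity with the Longest Common Subsequence dynamic program (cf. [6]), and guesses the emotional change a new piece might induce from experimentally labeled songs whose similarity exceeds a threshold δ. The normalization by the shorter sequence and the voting rule are illustrative assumptions.

```python
# Minimal sketch (illustrative only, not the thesis implementation):
# LCS-based similarity between ordered feature sequences, plus a rough
# way to guess the emotional change a new piece might induce from songs
# whose effect is already known from the listening experiments.

def lcs_length(a, b):
    """Length of the Longest Common Subsequence (dynamic programming, cf. [6])."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

def sequence_similarity(a, b):
    """Similarity in [0, 1]; normalizing by the shorter sequence is an assumption."""
    if not a or not b:
        return 0.0
    return lcs_length(a, b) / min(len(a), len(b))

def predict_change(query, labeled_songs, delta=0.6):
    """Majority vote among labeled songs at least delta-similar to the query.
    The threshold mirrors the δ varied in the experiments; the voting rule
    itself is only an illustrative assumption."""
    votes = {}
    for seq, change in labeled_songs:
        if sequence_similarity(query, seq) >= delta:
            votes[change] = votes.get(change, 0) + 1
    return max(votes, key=votes.get) if votes else None

# Hypothetical quantized feature sequences (symbols are made up for illustration).
song_x = ["VC+", "PC-", "HM", "VC+", "CHORD:C", "PC+"]
song_y = ["VC+", "PC-", "VC+", "CHORD:C", "PC-", "PC+"]
song_z = ["PC-", "VC-", "HM"]

print(sequence_similarity(song_x, song_y))             # 0.833...
print(predict_change(song_y, [(song_x, "positive"),
                              (song_z, "negative")]))  # "positive"
```

Songs grouped this way could then be checked against the valence changes measured in the experiments, as Chapter 4 does for several values of δ.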
References [1] 李侃儒, “個人化情緒/情境音樂檢索系統 (Personalized Emotion/Context Music Retrieval System),” 第五屆數位典藏技術研討會, 2006.
[2] 陳美如 (trans.), 篠田知璋、加藤美知子 (eds.), “標準音樂治療入門 (Introduction to Standard Music Therapy),” 台北市: 五南圖書公司, 2005.
[3] 謝文傑, “音樂治療對心理與身心健康的影響 (The Influence of Music Therapy on Psychological and Physical Health),” http://www.psychpark.org/psy/music.asp , 2002.
[4] 謝俊逢, “音樂療法-理論與方法 (Music Therapy: Theory and Methods),” 台北市: 大陸書局, 2003.
[5] H.C. Chen and A.L.P. Chen, “A Music Recommendation System Based on Music Data Grouping and User Interests,” ACM Conference on Information and Knowledge Management, 2001.
[6] T.H. Cormen, C.E. Leiserson, R.L. Rivest, and C. Stein, “Introduction to Algorithms,” 2nd ed., pp. 350-356, MIT Press, 2003.
[7] A. Gabrielsson and P.N. Juslin, “Emotional Expression in Music,” in Handbook of Affective Sciences, Oxford University Press, 2003.
[8] A. Gabrielsson and E. Lindström, “The Influence of Musical Structure on Emotional Expression,” in Music and Emotion: Theory and Research, 2001.
[9] J.S.R. Jang and H.R. Lee, “Hierarchical Filtering Method for Content-based Music Retrieval via Acoustic Input,” ACM Conference on Multimedia, 2001.
[10] P.N. Juslin, “Cue Utilization in Communication of Emotion in Music Performance: Relating Performance to Perception,” Journal of Experimental Psychology, 26, pp. 1797-1813, 2000.
[11] P.N. Juslin and P. Laukka, “Communication of Emotions in Vocal Expression and Music Performance: Different Channels, Same Code?,” Psychological Bulletin, 129(5), pp. 770-814, 2003.
[12] D. Liu, L. Lu, and H.J. Zhang, “Automatic Mood Detection from Acoustic Music Data,” International Symposium on Music Information Retrieval, 2003.
[13] S.R. Livingstone and A.R. Brown, “Dynamic Response: Real-Time Adaptation for Music Emotion,” Proceedings of the Second Australasian Conference on Interactive Entertainment, 2005.
[14] N.C. Maddage, C.S. Xu, M.S. Kankanhalli, and X. Shao, “Content-based Music Structure Analysis with Applications to Music Semantics Understanding,” ACM Conference on Multimedia, 2004.
[15] J.D. Morris and M.A. Boone, “The Effects of Music on Emotional Response, Brand Attitude, and Purchase Intent in an Emotional Advertising Condition,” Attitude Self-Assessment Manikin (AdSAM), adsam.com, 1998.
[16] G. Nagler, “Guess Chords from MIDI Binaries,” http://www.gnmidi.com/utils/midchord.zip , 1998/1999.
[17] National Institute of Advanced Industrial Science and Technology (AIST), “RWC Music Database,” http://staff.aist.go.jp/m.goto/RWC-MDB/ , 2002.
[18] N. Oliver and F. Flores-Mangas, “MPTrain: A Mobile, Music and Physiology-Based Personal Trainer,” ACM Conference on Human-Computer Interaction with Mobile Devices and Services, 2006.
[19] B. Pardo and W. Birmingham, “Chordal Analysis of Tonal Music,” Technical Report CSE-TR-439-01, Electrical Engineering and Computer Science Department, University of Michigan, 2001.
[20] J. Pickens and T. Crawford, “Harmonic Models for Polyphonic Music Retrieval,” ACM Conference on Information and Knowledge Management, 2002.
[21] J. Pickens and C. Iliopoulos, “Markov Random Fields and Maximum Entropy Modeling for Music,” International Symposium on Music Information Retrieval, 2005.
[22] J. Russell, “A Circumplex Model of Affect,” Journal of Personality and Social Psychology, 39, pp. 1161-1178, 1980.
[23] E. Schubert, “Measurement and Time Series Analysis of Emotion in Music,” PhD thesis, University of New South Wales, 1999.
[24] S. Sadie and A. Latham, “The Cambridge Music Guide,” Cambridge University Press, 1990.
[25] M. Steinbach, G. Karypis, and V. Kumar, “A Comparison of Document Clustering Techniques,” KDD Workshop on Text Mining, 2000.
[26] I.H. Witten and E. Frank, “Data Mining: Practical Machine Learning Tools and Techniques,” 2nd ed., Morgan Kaufmann, 2005.
[27] Y.W. Zhu and M.S. Kankanhalli, “Music Scale Modeling for Melody Matching,” ACM Conference on Multimedia, 2003.
Description Master's degree
National Chengchi University
Department of Computer Science
94753019
95
Source http://thesis.lib.nccu.edu.tw/record/#G0094753019
Type thesis
URI https://nccur.lib.nccu.edu.tw/handle/140.119/32682
Table of Contents
Chinese Abstract
English Abstract
Acknowledgements
Table of Contents
List of Figures
List of Tables
Chapter 1: Introduction
1.1 Background and Motivation
1.2 Objectives
1.3 Thesis Organization
Chapter 2: Related Work
2.1 Music Features Used in Previous Music Retrieval Methods
2.2 Emotion Research and Classification
2.3 Emotion-Based Music Retrieval Systems
Chapter 3: Music Features and Similarity Computation
3.1 Selecting Music Features
3.1.1 Basic MIDI Features
3.1.1.1 Volume
3.1.1.2 Pitch
3.1.2 High-Level Music Features
3.1.2.1 Volume Change (VC)
3.1.2.2 Pitch Change (PC)
3.1.2.3 Harmony (HM)
3.1.2.4 Clarity (ATC)
3.1.2.5 Chord (CHORD)
3.2 Similarity Computation
Chapter 4: Experiments on the Emotional Changes Induced by Each Piece, and Analysis
4.1 Experimental Method and System Implementation
4.1.1 Experiments to Find Music That Guides Emotion Toward Positive Valence
4.1.2 Experimental Results
4.2 Analysis of the Experiments
4.2.1 Comparison of the Two Music Representations
4.2.2 Effectiveness of Music Clustering
4.2.3 How Each Genre of Music Affects Emotion
Chapter 5: Conclusions and Future Work
References

List of Figures
Figure 2.1: System architecture of [5].
Figure 2.2: Two-dimensional emotion classification (2DES).
Figure 2.3: Thayer's mood model.
Figure 2.4: The hierarchical music framework proposed in [12].
Figure 2.5: Relationship of mASR and vASR to emotion.
Figure 2.6: Emotion classification by mASR and vASR.
Figure 2.7: Mapping between music features and emotions.
Figure 2.8: Music data representation in [1].
Figure 3.1: The Longest Common Subsequence (LCS) algorithm.
Figure 3.2: An example of the Longest Common Subsequence (LCS) algorithm.
Figure 3.3: Illustration of the similarity matrix.
Figure 4.1: Filling out the pre-test questionnaire.
Figure 4.2: Playing the music.
Figure 4.3: Filling out the post-test questionnaire.
Figure 4.4: Success rates of the two methods at δ = 60%.
Figure 4.5: Success rates of the two methods at δ = 70%.
Figure 4.6: Success rates of the two methods at δ = 80%.
Figure 4.7: Classical music guiding emotion toward positive valence.
Figure 4.8: Jazz guiding emotion toward positive valence.
Figure 4.9: Pop music guiding emotion toward positive valence.
Figure 4.10: Other genres (excluding the above three; including traditional music, Latin music, nursery rhymes, marches, etc.).

List of Tables
Table 2.1: Juslin's model of mood.
Table 4.1: Number of mutually similar songs among positive-valence music at different thresholds.
Table 4.2: Number of songs similar to negative-valence music for positive-valence music at different thresholds.
Table 4.3: Correlation between music that guides emotion toward positive valence while lowering arousal and each genre of negative-valence music.