Title

流行音樂風格與情感反應研究 [A Study on Popular Music Styles and Affective Responses]

Parallel Title

A Study on Popular Music Styles and Affective Responses

Author

林佩儒(Pei-Ju Lin)

Keywords

Popular Music Style ; Music Feature ; Affective Response ; Factor Analysis ; Support Vector Machine

Journal

商業設計學報 [Journal of Commercial Design]

Volume/Issue (Publication Date)

Issue 26 (2023/03/01)

Pages

1 - 15

Language

Traditional Chinese; English

Chinese Abstract (translated)

As the popular music industry flourishes, applying scientific research methods to music design and performance makes it possible to produce concerts whose effects better match the performance context. Through the research steps of collecting music samples across eight major styles, screening affective response scales, analyzing music signals, and applying a support vector machine together with musical feature analysis, this study clarifies the elements that constitute music and the relationship between the characteristics those elements exhibit in a piece and listeners' affective responses, so that a suitable popular music concert repertoire can be constructed that immerses the audience in a specific concert style and atmosphere. The study first screened representative tracks for each popular music style, then collected music samples and screened affective vocabulary, used factor analysis to derive classification labels, and built an affective classification model with a support vector machine. The results show that the affective response scales group into three classification labels: "uniqueness", "feeling", and "variation". Audio analysis conducted separately for these three labels reveals the relationship between each label and two features, volume and timbre. Finally, the best trained model obtained with the support vector machine reached a mean accuracy of 93.3%, demonstrating the feasibility of the results. The analysis also yields concrete, specific recommendations for music design and serves as an important reference for future research on music and affective responses.

English Abstract

As the popular music industry thrives, scientific methods have become a more effective way of designing music concerts with befitting performance effects. This study collected samples of eight major music styles, selected affective response dimensions, analyzed music signals, employed a support vector machine, and analyzed musical characteristics to understand the composing elements of music and the connection between the features they display in music and affective responses. Constructing a suitable popular music concert repertoire can immerse the audience in certain concert music styles and scenarios. We first chose songs representative of various popular music styles and then collected music samples and selected descriptive adjectives. Using factor analysis, we selected category labels and used a support vector machine to establish an affective categorization model. The study results revealed three category labels for the affective response dimensions: uniqueness, feeling, and variation. We conducted signal analysis based on these three category labels to derive the relationships between these category labels and the features volume and timbre. The mean accuracy rate of the optimized and trained model obtained using a support vector machine was 93.3%, thereby demonstrating the feasibility of the study results. The analysis results also provide clear and specific suggestions for music design and serve as an important reference for future research on music and affective responses.
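The pipeline the abstract describes (condense affective-rating scales with factor analysis, assign each clip a category label, then train a support vector machine on audio features and cross-validate) can be sketched as follows. This is a minimal illustration, not the study's actual code: the data are synthetic stand-ins for the music samples and rating scales, the label-assignment rule (dominant factor) is an assumption, and scikit-learn is assumed as the toolkit.

```python
# A minimal sketch (not the authors' code) of the pipeline described in the
# abstract: factor analysis over affective-response ratings, then an SVM
# classifier on audio features, evaluated by cross-validation.
# All data below are synthetic stand-ins for the study's music samples.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 80 music clips rated on 12 affective scales,
# plus 2 audio features per clip (e.g. volume and a timbre descriptor).
ratings = rng.normal(size=(80, 12))
audio_features = rng.normal(size=(80, 2))

# Step 1: factor analysis condenses the rating scales into three latent
# dimensions (the study's "uniqueness", "feeling", and "variation").
fa = FactorAnalysis(n_components=3, random_state=0)
factor_scores = fa.fit_transform(ratings)

# Step 2: assign each clip a category label from its dominant factor
# (an assumed rule for illustration).
labels = np.argmax(factor_scores, axis=1)

# Step 3: train an SVM to predict the label from audio features and
# estimate mean accuracy with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, audio_features, labels, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```

On synthetic noise the accuracy is near chance; the study's reported 93.3% reflects real structure between its audio features and affective labels that random data cannot reproduce.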

Subject Classification

Humanities > Arts
Social Sciences > Communication Studies