Music emotion recognition: A state of the art review
This paper surveys the state of the art in automatic emotion recognition in music. Music is often referred to as a "language of emotion," and it is natural for us to categorize music in terms of its emotional associations. Myriad features, such as harmony, timbre, interpretation, and lyrics, affect emotion, and the mood of a piece may also change over its duration. But in developing automated systems to organize music in terms of emotional content, we are faced with a problem that often lacks a well-defined answer; there may be considerable disagreement regarding the perception and interpretation of the emotions of a song, or ambiguity within the piece itself. Compared to other music information retrieval tasks (e.g., genre identification), the identification of musical mood is still in its early stages, though it has received increasing attention in recent years. In this paper we explore a wide range of research in music emotion recognition, particularly focusing on methods that use contextual text information (e.g., websites, tags, and lyrics) and content-based approaches, as well as systems combining multiple feature domains. © 2010 International Society for Music Information Retrieval.
Proceedings of the 11th International Society for Music Information Retrieval Conference, ISMIR 2010
Kim, Youngmoo E.; Schmidt, Erik M.; Migneco, Raymond; Morton, Brandon G.; Richardson, Patrick; Scott, Jeffrey; Speck, Jacquelin A.; and Turnbull, Douglas. "Music emotion recognition: A state of the art review" (2010). Faculty Articles Indexed in Scopus. 1434.