Exploring mood metadata: Relationships with genre, artist and usage metadata

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

There is a growing interest in developing and then evaluating Music Information Retrieval (MIR) systems that can provide automated access to the mood dimension of music. Mood as a music access feature, however, is not well understood: the terms used to describe it are not standardized and their application can be highly idiosyncratic. To better understand how we might comprehensively develop and formally evaluate useful automated mood access techniques, we explore the relationships that mood has with genre, artist and usage metadata. Statistical analyses of term interactions across three metadata collections (AllMusicGuide.com, epinions.com and Last.fm) reveal important consistencies within the genre-mood and artist-mood relationships. These consistencies lead us to recommend a cluster-based approach that overcomes specific term-related problems by creating a relatively small set of data-derived "mood spaces" that could form the ground truth for a proposed MIREX "Automated Mood Classification" task.
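As a rough illustration of the cluster-based approach the abstract proposes, the sketch below groups mood terms into "mood spaces" by hierarchically clustering their co-occurrence profiles. This is a minimal sketch, not the authors' code: the mood terms, the toy genre co-occurrence counts, the cosine-distance and average-linkage choices, and the three-cluster cutoff are all illustrative assumptions; in the paper, the inputs would be derived from AllMusicGuide.com, epinions.com and Last.fm metadata.

    # Minimal sketch of deriving "mood spaces" by clustering mood terms
    # on their co-occurrence profiles. All data below is illustrative.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Hypothetical mood terms and toy co-occurrence counts with five
    # genres (rows: mood terms, columns: genres). Real counts would come
    # from collections such as AllMusicGuide.com or Last.fm.
    moods = ["aggressive", "angry", "calm", "peaceful", "happy", "cheerful"]
    counts = np.array([
        [80,  5,  2,  1, 10],   # aggressive
        [75,  8,  1,  2, 12],   # angry
        [ 3, 60, 55,  4,  6],   # calm
        [ 2, 58, 60,  3,  5],   # peaceful
        [10, 20,  5, 70, 40],   # happy
        [12, 18,  6, 65, 45],   # cheerful
    ], dtype=float)

    # Normalize each row into a distribution over genres, then cluster
    # terms whose usage profiles are similar.
    profiles = counts / counts.sum(axis=1, keepdims=True)
    distances = pdist(profiles, metric="cosine")
    tree = linkage(distances, method="average")
    labels = fcluster(tree, t=3, criterion="maxclust")  # request 3 "mood spaces"

    for cluster_id in sorted(set(labels)):
        members = [m for m, c in zip(moods, labels) if c == cluster_id]
        print(f"mood space {cluster_id}: {', '.join(members)}")

With these toy counts, terms with similar genre profiles (e.g. "calm" and "peaceful") land in the same cluster, which is the intuition behind replacing individual, idiosyncratic mood terms with a small set of data-derived mood spaces.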

Original language: English (US)
Title of host publication: Proceedings of the 8th International Conference on Music Information Retrieval, ISMIR 2007
Pages: 67-72
Number of pages: 6
State: Published - 2007
Event: 8th International Conference on Music Information Retrieval, ISMIR 2007 - Vienna, Austria
Duration: Sep 23, 2007 - Sep 27, 2007

Publication series

Name: Proceedings of the 8th International Conference on Music Information Retrieval, ISMIR 2007

Other

Other: 8th International Conference on Music Information Retrieval, ISMIR 2007
Country/Territory: Austria
City: Vienna
Period: 9/23/07 - 9/27/07

ASJC Scopus subject areas

  • Music
  • Information Systems
