Paper Detail

MGPHot: A Dataset of Musicological Annotations for Popular Music (1958-2022)

Paper ID: https://openalex.org/W4410824464

Year: 2025

Citations: 0

Source

Transactions of the International Society for Music Information Retrieval

Slug: tismir

Abstract

The Music Genome Project® is an extensive music annotation effort spanning two decades, during which a team of musicologists has been annotating a dataset of millions of songs with hundreds of musicological attributes. A derivative of this effort is presented in this paper. We are releasing MGPHot, a dataset of more than 21,000 songs that have appeared at least once in the Billboard Hot 100 charts from 1958 until 2022, annotated with 58 musical attributes that are grouped into seven different categories: rhythm, compositional focus, harmony, instrumentation, sonority, vocals, and lyrics. Given the unprecedented quality and breadth of annotation, as well as the size of the released corpus, the MGPHot dataset opens up a myriad of possibilities for musicology and music information retrieval, such as auto‑tagging, chart prediction, and music recommendation. To illustrate the breadth and depth of the dataset, we conduct a study on the evolution of popular music over the past 65 years, focusing on when and how changes occurred. While previous research has approached this topic through audio processing or historical methods, comprehensive musicological analyses at scale have been lacking, which this new dataset facilitates. In our study, we identify distinct eras and pivotal moments, reaffirming and broadening previous research on stylistic revolutions, and investigating the musical attributes that drive these changes. By analyzing the 58 musical attributes, we identify three major revolutions (1964, 1983, and 2016) and two minor ones (1991 and 2007), each propelled by specific musical attributes.
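As a rough illustration of the kind of analysis the abstract describes, the sketch below averages a few attributes per year and flags the largest year-over-year shift in the attribute profile as a candidate "revolution" year. The song rows, attribute names, and values are invented for the example; MGPHot's actual schema and the paper's method may differ.

```python
from collections import defaultdict
import math

# Toy annotations: (year, {attribute: value}). Values are invented,
# not real MGPHot data; a deliberate jump is placed at 1964.
songs = [
    (1962, {"rhythm": 0.30, "vocals": 0.50}),
    (1963, {"rhythm": 0.32, "vocals": 0.52}),
    (1964, {"rhythm": 0.60, "vocals": 0.70}),  # simulated shift
    (1965, {"rhythm": 0.62, "vocals": 0.71}),
]

def yearly_means(rows):
    """Average each attribute over all songs in a year."""
    sums = defaultdict(lambda: defaultdict(float))
    counts = defaultdict(int)
    for year, attrs in rows:
        counts[year] += 1
        for name, value in attrs.items():
            sums[year][name] += value
    return {
        year: {name: total / counts[year] for name, total in attrs.items()}
        for year, attrs in sums.items()
    }

def shift_magnitude(prev, curr):
    """Euclidean distance between two years' attribute profiles."""
    return math.sqrt(sum((curr[k] - prev[k]) ** 2 for k in prev))

means = yearly_means(songs)
years = sorted(means)
shifts = {y2: shift_magnitude(means[y1], means[y2])
          for y1, y2 in zip(years, years[1:])}
peak = max(shifts, key=shifts.get)
print(peak)  # → 1964, the simulated shift year
```

On the toy data the simulated 1964 jump dominates, mirroring (in miniature) how attribute-level profiles can localize a stylistic change to a specific year.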

Authors

  • Sergio Oramas
  • Fabien Gouyon
  • S. D. Hogan
  • Camilo Landau
  • Andreas F. Ehmann

Topics

  • Music and Audio Processing
  • Music Technology and Sound Studies
  • Neuroscience and Music Perception

Similar papers

Next explainability step

This page now serves real metadata from Postgres. Next, attach ranking run context and per-signal contributions.
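One possible shape for that next step, sketched with `sqlite3` standing in for the Postgres instance. The table and column names (`papers`, `ranking_runs`, `signal_contributions`) and the sample rows are assumptions for illustration, not the app's real schema.

```python
import sqlite3

# In-memory stand-in for Postgres; schema and rows are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE papers (id TEXT PRIMARY KEY, title TEXT);
CREATE TABLE ranking_runs (run_id INTEGER, paper_id TEXT, rank INTEGER);
CREATE TABLE signal_contributions (
    run_id INTEGER, paper_id TEXT, signal TEXT, weight REAL
);
INSERT INTO papers VALUES ('W4410824464', 'MGPHot');
INSERT INTO ranking_runs VALUES (1, 'W4410824464', 3);
INSERT INTO signal_contributions VALUES
    (1, 'W4410824464', 'citations', 0.0),
    (1, 'W4410824464', 'recency', 0.8);
""")

def paper_detail(paper_id, run_id):
    """Fetch a paper plus its rank and per-signal contributions
    for one ranking run."""
    title, rank = conn.execute(
        """SELECT p.title, r.rank
           FROM papers p JOIN ranking_runs r ON r.paper_id = p.id
           WHERE p.id = ? AND r.run_id = ?""",
        (paper_id, run_id),
    ).fetchone()
    signals = dict(conn.execute(
        """SELECT signal, weight FROM signal_contributions
           WHERE paper_id = ? AND run_id = ?""",
        (paper_id, run_id),
    ).fetchall())
    return {"title": title, "rank": rank, "signals": signals}

detail = paper_detail('W4410824464', 1)
print(detail)
```

The join keeps the detail page a single round trip: one query for the run-scoped rank, one for the per-signal breakdown, both keyed on `(paper_id, run_id)`.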