Music representation learning based on editorial metadata from Discogs


  • dc.contributor.author Alonso-Jiménez, Pablo
  • dc.contributor.author Serra, Xavier
  • dc.contributor.author Bogdanov, Dmitry
  • dc.date.accessioned 2023-04-11T06:43:04Z
  • dc.date.available 2023-04-11T06:43:04Z
  • dc.date.issued 2022
  • dc.description Paper presented at the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022), held 4-8 December 2022 in Bengaluru, India.
  • dc.description.abstract This paper revisits the idea of music representation learning supervised by editorial metadata, contributing to the state of the art in two ways. First, we exploit the public editorial metadata available on Discogs, an extensive community-maintained music database containing information about artists, releases, and record labels. Second, we use a contrastive learning setup based on COLA, different from previous systems based on triplet loss. We train models targeting several associations derived from the metadata and experiment with stacked combinations of learned representations, evaluating them on standard music classification tasks. Additionally, we consider learning all the associations jointly in a multi-task setup. We show that it is possible to improve the performance of current self-supervised models by using inexpensive metadata commonly available in music collections, producing representations comparable to those learned on classification setups. We find that the resulting representations based on editorial metadata outperform a system trained with music style tags available in the same large-scale dataset, which motivates further research using this type of supervision. Additionally, we give insights on how to preprocess Discogs metadata to build training objectives and provide public pre-trained models.
  • dc.description.sponsorship This research was carried out under the project Musical AI - PID2019-111403GB-I00/AEI/10.13039/501100011033, funded by the Spanish Ministerio de Ciencia e Innovación and the Agencia Estatal de Investigación.
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Alonso-Jiménez P, Serra X, Bogdanov D. Music representation learning based on editorial metadata from Discogs. In: Rao P, Murthy H, Srinivasamurthy A, Bittner R, Caro Repetto R, Goto M, Serra X, Miron M, editors. Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022); 2022 Dec 4-8; Bengaluru, India. [Canada]: International Society for Music Information Retrieval; 2022. p. 825-33. DOI: 10.5281/zenodo.7316790
  • dc.identifier.doi http://dx.doi.org/10.5281/zenodo.7316790
  • dc.identifier.isbn 978-1-7327299-2-6
  • dc.identifier.uri http://hdl.handle.net/10230/56444
  • dc.language.iso eng
  • dc.publisher International Society for Music Information Retrieval (ISMIR)
  • dc.relation.ispartof Rao P, Murthy H, Srinivasamurthy A, Bittner R, Caro Repetto R, Goto M, Serra X, Miron M, editors. Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022); 2022 Dec 4-8; Bengaluru, India. [Canada]: International Society for Music Information Retrieval; 2022. p. 825-33.
  • dc.relation.isreferencedby https://multimediaeval.github.io/2021-Emotion-and-Theme-Recognition-in-Music-Task/
  • dc.relation.projectID info:eu-repo/grantAgreement/ES/2PE/PID2019-111403GB-I00
  • dc.rights © P. Alonso, X. Serra, and D. Bogdanov. Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.rights.uri http://creativecommons.org/licenses/by/4.0/
  • dc.subject.other Metadata
  • dc.subject.other Music
  • dc.title Music representation learning based on editorial metadata from Discogs
  • dc.type info:eu-repo/semantics/conferenceObject
  • dc.type.version info:eu-repo/semantics/publishedVersion
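The abstract above mentions a contrastive learning setup based on COLA, which scores pairs of embeddings with a bilinear similarity and treats the other items in a batch as negatives under a softmax cross-entropy objective. The sketch below is a minimal, hedged illustration of that general idea in NumPy; the function names and shapes are assumptions for illustration, not the authors' released code or pre-trained models.

```python
import numpy as np

def bilinear_similarity(anchors, candidates, W):
    """Compute s(x, y) = x^T W y for every anchor/candidate pair.

    anchors, candidates: (batch, dim) embedding matrices.
    W: (dim, dim) learnable bilinear weight matrix.
    Returns a (batch, batch) similarity matrix.
    """
    return anchors @ W @ candidates.T

def contrastive_loss(anchors, positives, W):
    """COLA-style in-batch contrastive loss.

    Row i of `positives` is the positive for row i of `anchors`;
    every other row in the batch serves as a negative.
    """
    logits = bilinear_similarity(anchors, positives, W)
    # Numerically stable log-softmax over the candidate axis.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy: the true pair for each anchor lies on the diagonal.
    return -np.mean(np.diag(log_probs))

# Illustrative usage with random embeddings (hypothetical shapes).
rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 16))
positives = rng.normal(size=(8, 16))
W = np.eye(16)
loss = contrastive_loss(anchors, positives, W)
```

In the paper's setting, the positive pairs would be derived from Discogs editorial metadata (e.g. two tracks sharing an artist, release, or label) rather than from two augmented views of the same audio clip.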