Music representation learning based on editorial metadata from Discogs

dc.contributor.authorAlonso-Jiménez, Pablo
dc.contributor.authorSerra, Xavier
dc.contributor.authorBogdanov, Dmitry
dc.date.accessioned2023-04-11T06:43:04Z
dc.date.available2023-04-11T06:43:04Z
dc.date.issued2022
dc.descriptionPaper presented at the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022), held December 4-8, 2022 in Bengaluru, India.
dc.description.abstractThis paper revisits the idea of music representation learning supervised by editorial metadata, contributing to the state of the art in two ways. First, we exploit the public editorial metadata available on Discogs, an extensive community-maintained music database containing information about artists, releases, and record labels. Second, we use a contrastive learning setup based on COLA, different from previous systems based on triplet loss. We train models targeting several associations derived from the metadata and experiment with stacked combinations of learned representations, evaluating them on standard music classification tasks. Additionally, we consider learning all the associations jointly in a multi-task setup. We show that it is possible to improve the performance of current self-supervised models by using inexpensive metadata commonly available in music collections, producing representations comparable to those learned on classification setups. We find that the resulting representations based on editorial metadata outperform a system trained with music style tags available in the same large-scale dataset, which motivates further research using this type of supervision. Additionally, we give insights on how to preprocess Discogs metadata to build training objectives and provide public pre-trained models.
dc.description.sponsorshipThis research was carried out under the project Musical AI - PID2019-111403GB-I00/AEI/10.13039/501100011033, funded by the Spanish Ministerio de Ciencia e Innovación and the Agencia Estatal de Investigación.
dc.format.mimetypeapplication/pdf
dc.identifier.citationAlonso-Jiménez P, Serra X, Bogdanov D. Music representation learning based on editorial metadata from Discogs. In: Rao P, Murthy H, Srinivasamurthy A, Bittner R, Caro Repetto R, Goto M, Serra X, Miron M, editors. Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022); 2022 Dec 4-8; Bengaluru, India. [Canada]: International Society for Music Information Retrieval; 2022. p. 825-33. DOI: 10.5281/zenodo.7316790
dc.identifier.doihttp://dx.doi.org/10.5281/zenodo.7316790
dc.identifier.isbn978-1-7327299-2-6
dc.identifier.urihttp://hdl.handle.net/10230/56444
dc.language.isoeng
dc.publisherInternational Society for Music Information Retrieval (ISMIR)
dc.relation.ispartofRao P, Murthy H, Srinivasamurthy A, Bittner R, Caro Repetto R, Goto M, Serra X, Miron M, editors. Proceedings of the 23rd International Society for Music Information Retrieval Conference (ISMIR 2022); 2022 Dec 4-8; Bengaluru, India. [Canada]: International Society for Music Information Retrieval; 2022. p. 825-33.
dc.relation.isreferencedbyhttps://multimediaeval.github.io/2021-Emotion-and-Theme-Recognition-in-Music-Task/
dc.relation.projectIDinfo:eu-repo/grantAgreement/ES/2PE/PID2019-111403GB-I00
dc.rights© P. Alonso, X. Serra, and D. Bogdanov. Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).
dc.rights.accessRightsinfo:eu-repo/semantics/openAccess
dc.rights.urihttp://creativecommons.org/licenses/by/4.0/
dc.subject.otherMetadata
dc.subject.otherMusic
dc.titleMusic representation learning based on editorial metadata from Discogs
dc.typeinfo:eu-repo/semantics/conferenceObject
dc.type.versioninfo:eu-repo/semantics/publishedVersion

Files

Original bundle

Name: Serra_Pro_Musi.pdf
Size: 181.27 KB
Format: Adobe Portable Document Format