Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections


  • dc.contributor.author Knees, Peter
  • dc.contributor.author Faraldo Pérez, Ángel
  • dc.contributor.author Herrera Boyer, Perfecto, 1964-
  • dc.contributor.author Vogl, Richard
  • dc.contributor.author Böck, Sebastian
  • dc.contributor.author Hörschläger, Florian
  • dc.contributor.author Le Goff, Mickael
  • dc.date.accessioned 2020-08-26T10:51:36Z
  • dc.date.available 2020-08-26T10:51:36Z
  • dc.date.issued 2015
  • dc.description Paper presented at: the 16th International Society for Music Information Retrieval Conference (ISMIR), held 26-30 October 2015 in Málaga.
  • dc.description.abstract We present two new data sets for automatic evaluation of tempo estimation and key detection algorithms. In contrast to existing collections, both released data sets focus on electronic dance music (EDM). The data sets have been automatically created from user feedback and annotations extracted from web sources. More precisely, we utilize user corrections submitted to an online forum to report wrong tempo and key annotations on the Beatport website. Beatport is a digital record store targeted at DJs and focusing on EDM genres. For all annotated tracks in the data sets, samples of at least one minute in length can be freely downloaded. For key detection, further ground truth is extracted from expert annotations manually assigned to Beatport tracks for benchmarking purposes. The set for tempo estimation comprises 664 tracks and the set for key detection 604 tracks. We detail the creation process of both data sets and perform extensive benchmarks using state-of-the-art algorithms from both academic research and commercial products.
  • dc.description.sponsorship This work is supported by the European Union Seventh Framework Programme FP7/2007-2013 through the GiantSteps project (grant agreement no. 610591).
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Knees P, Faraldo A, Herrera P, Vogl R, Böck S, Hörschläger F, Le Goff M. Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections. In: Proceedings of the 16th International Society for Music Information Retrieval Conference (ISMIR); 2015 Oct 26-30; Málaga, Spain. [Málaga]: International Society for Music Information Retrieval, 2015. p. 364-70.
  • dc.identifier.uri http://hdl.handle.net/10230/45236
  • dc.language.iso eng
  • dc.publisher International Society for Music Information Retrieval (ISMIR)
  • dc.relation.ispartof Proceedings of the 16th International Society for Music Information Retrieval Conference (ISMIR); 2015 Oct 26-30; Málaga, Spain. [Málaga]: International Society for Music Information Retrieval, 2015. p. 364-70.
  • dc.relation.isreferencedby http://www.cp.jku.at/datasets/giantsteps/#data
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/FP7/610591
  • dc.rights Licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0). Attribution: Knees P, Faraldo A, Herrera P, Vogl R, Böck S, Hörschläger F, Le Goff M. Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections, International Society for Music Information Retrieval Conference, 2015 Oct 26-30; Málaga, Spain.
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.rights.uri https://creativecommons.org/licenses/by/4.0/
  • dc.title Two data sets for tempo estimation and key detection in electronic dance music annotated from user corrections
  • dc.type info:eu-repo/semantics/conferenceObject
  • dc.type.version info:eu-repo/semantics/publishedVersion
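
The abstract describes benchmarking tempo estimation algorithms against the released annotations (see dc.relation.isreferencedby for the data location). As a rough, hedged illustration only: the sketch below assumes the tempo annotations are plain-text files containing one BPM value each, named by track identifier under an `annotations` directory, and uses a conventional +/- 4% tolerance for "Accuracy 1"; the directory layout, file extension, and helper names are assumptions for illustration, not the dataset's documented structure.

```python
# Hypothetical sketch: scoring a tempo estimator against per-track BPM files.
# Assumed layout: one "<track_id>.bpm" text file per track with a single BPM value.
from pathlib import Path


def load_bpm_annotations(annotation_dir: str) -> dict[str, float]:
    """Read one BPM value per *.bpm file, keyed by track id (the file stem)."""
    annotations = {}
    for path in Path(annotation_dir).glob("*.bpm"):
        annotations[path.stem] = float(path.read_text().strip())
    return annotations


def accuracy1(estimates: dict[str, float], ground_truth: dict[str, float],
              tolerance: float = 0.04) -> float:
    """Fraction of annotated tracks whose estimate lies within +/- 4% of the annotation."""
    if not ground_truth:
        return 0.0
    hits = sum(
        1 for track, true_bpm in ground_truth.items()
        if track in estimates and abs(estimates[track] - true_bpm) <= tolerance * true_bpm
    )
    return hits / len(ground_truth)


if __name__ == "__main__":
    truth = load_bpm_annotations("giantsteps-tempo/annotations")  # assumed path
    # 'estimates' would come from the tempo estimation algorithm under test;
    # here a placeholder copy of the ground truth is used so the script runs.
    estimates = dict(truth)
    print(f"Accuracy1: {accuracy1(estimates, truth):.3f}")
```

A key-detection benchmark would follow the same pattern with per-track key labels and a key-comparison rule in place of the BPM tolerance.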