Multi-feature beat tracking


  • dc.contributor.author Zapata González, José Ricardo
  • dc.contributor.author Davies, Matthew E. P.
  • dc.contributor.author Gómez Gutiérrez, Emilia, 1975-
  • dc.date.accessioned 2023-04-27T06:16:12Z
  • dc.date.available 2023-04-27T06:16:12Z
  • dc.date.issued 2014
  • dc.description.abstract A recent trend in the field of beat tracking for musical audio signals has been to explore techniques for measuring the level of agreement and disagreement between a committee of beat tracking algorithms. By using beat tracking evaluation methods to compare all pairwise combinations of beat tracker outputs, it has been shown that selecting the beat tracker which most agrees with the remainder of the committee, on a song-by-song basis, leads to improved performance which surpasses the accuracy of any individual beat tracker used on its own. In this paper we extend this idea towards presenting a single, standalone beat tracking solution which can exploit the benefit of mutual agreement without the need to run multiple separate beat tracking algorithms. In contrast to existing work, we re-cast the problem as one of selecting between the beat outputs resulting from a single beat tracking model with multiple, diverse input features. Through extended evaluation on a large annotated database, we show that our multi-feature beat tracker can outperform the state of the art, and thereby demonstrate that there is sufficient diversity in input features for beat tracking, without the need for multiple tracking models.
  • dc.description.sponsorship This work was supported in part by the R+I+D Ph.D. scholarship of Universidad Pontificia Bolivariana and Colciencias (Colombia), the projects of the Spanish Ministry of Science and Innovation DRIMS (MICINN-TIN2009-14247-C02-01), SIGMUS (MINECO-TIN2012-36650) and MIReS (EC-FP7-MIReS), and in part by the Media Arts and Technologies project (MAT), NORTE-07-0124-FEDER-000061, financed by the North Portugal Regional Operational Programme (ON.2 O Novo Norte), under the National Strategic Reference Framework (NSRF), through the European Regional Development Fund (ERDF), and by national funds through the Portuguese funding agency, Fundação para a Ciência e a Tecnologia (FCT), the FCT post-doctoral grant (SFRH/BPD/88722/2012), the PHENICX (EU: FP7 2007/2013 grant agreement no. 601166) and COFLA (Junta de Andalucía P09-TIC-4840) projects.
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Zapata JR, Davies MEP, Gómez E. Multi-feature beat tracking. IEEE/ACM Trans Audio Speech Lang Process. 2014;22(4):816-25. DOI: 10.1109/TASLP.2014.2305252
  • dc.identifier.doi http://dx.doi.org/10.1109/TASLP.2014.2305252
  • dc.identifier.issn 1558-7916
  • dc.identifier.uri http://hdl.handle.net/10230/56586
  • dc.language.iso eng
  • dc.publisher Institute of Electrical and Electronics Engineers (IEEE)
  • dc.relation.ispartof IEEE/ACM Transactions on Audio, Speech, and Language Processing. 2014;22(4):816-25.
  • dc.relation.projectID info:eu-repo/grantAgreement/ES/3PN/TIN2009-14247-C02-01
  • dc.relation.projectID info:eu-repo/grantAgreement/ES/3PN/TIN2012-36650
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/FP7/601166
  • dc.rights © 2014 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. http://dx.doi.org/10.1109/TASLP.2014.2305252
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.subject.keyword beat tracking
  • dc.subject.keyword evaluation
  • dc.subject.keyword music information retrieval
  • dc.subject.keyword music signal processing
  • dc.title Multi-feature beat tracking
  • dc.type info:eu-repo/semantics/article
  • dc.type.version info:eu-repo/semantics/acceptedVersion
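
The abstract above describes selecting, on a per-excerpt basis, one output from a single beat-tracking model driven by several diverse input features, using the mutual agreement among the candidate beat sequences. The Python sketch below illustrates that selection-by-agreement idea under simplifying assumptions: candidate beat sequences are plain lists of beat times, and agreement is scored with a simplified F-measure at a ±70 ms tolerance rather than the evaluation measures used in the paper. Names such as `select_by_agreement` are illustrative and do not come from the authors' code.

```python
# Illustrative sketch (not the authors' implementation): pick the beat sequence
# that agrees most with the other candidates, where each candidate is assumed to
# come from one beat-tracking model fed with a different input feature.

import numpy as np


def f_measure(est, ref, tol=0.07):
    """Simplified F-measure between two beat sequences (times in seconds).

    An estimated beat counts as a hit if it lies within `tol` seconds of some
    reference beat. This is a stand-in for the beat evaluation measures used
    in the paper, kept deliberately simple for illustration.
    """
    est, ref = np.asarray(est, dtype=float), np.asarray(ref, dtype=float)
    if len(est) == 0 or len(ref) == 0:
        return 0.0
    hits = sum(1 for t in est if np.min(np.abs(ref - t)) <= tol)
    precision = hits / len(est)
    recall = hits / len(ref)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)


def select_by_agreement(beat_sequences):
    """Return the candidate with the highest mean agreement with the others."""
    n = len(beat_sequences)
    mean_agreement = [
        np.mean([f_measure(beat_sequences[i], beat_sequences[j])
                 for j in range(n) if j != i])
        for i in range(n)
    ]
    best = int(np.argmax(mean_agreement))
    return beat_sequences[best], mean_agreement


if __name__ == "__main__":
    # Hypothetical beat times (seconds) from the same tracker run on three
    # different input features; the third is deliberately off the beat.
    candidates = [
        [0.50, 1.00, 1.50, 2.00, 2.50],   # feature A
        [0.52, 1.01, 1.49, 2.02, 2.51],   # feature B
        [0.25, 0.75, 1.25, 1.75, 2.25],   # feature C (off-beat)
    ]
    chosen, scores = select_by_agreement(candidates)
    print("mean agreements:", [round(s, 3) for s in scores])
    print("selected beats:", chosen)
```

In this toy example the off-beat candidate receives a low mean agreement and is rejected, while either of the two mutually consistent candidates can be selected; the paper's contribution is to obtain such diverse candidates from a single tracking model with multiple input features, rather than from a committee of separate algorithms.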