Automatic piano fingering from partially annotated scores using autoregressive neural networks
- dc.contributor.author Ramoneda, Pedro
- dc.contributor.author Jeong, Dasaem
- dc.contributor.author Nakamura, Eita
- dc.contributor.author Serra, Xavier
- dc.contributor.author Miron, Marius
- dc.date.accessioned 2022-10-07T06:26:36Z
- dc.date.available 2022-10-07T06:26:36Z
- dc.date.issued 2022
- dc.description Paper presented at: 30th ACM International Conference on Multimedia (MM'22), held 10-14 October 2022 in Lisbon, Portugal
- dc.description.abstract Piano fingering is a creative and highly individualised task acquired by musicians progressively in their first years of music education. Pianists must learn to choose the order of fingers with which to play the piano keys because, unlike other technique elements, finger and hand movements are not engraved in scores. Numerous research efforts have been conducted for automatic piano fingering based on a previous dataset composed of 150 score excerpts fully annotated by multiple expert annotators. However, most piano sheets include partial annotations for problematic finger and hand movements. We introduce a novel dataset for the task, the ThumbSet dataset, containing 2523 pieces with partial and noisy annotations of piano fingering crowdsourced from non-expert annotators. As part of our methodology, we propose two autoregressive neural networks with beam search decoding that model automatic piano fingering as a sequence-to-sequence learning problem, taking into account the correlation between output finger labels. We design the first model with the exact pitch representation of previous proposals. The second model uses graph neural networks to more effectively represent polyphony, whose treatment has been a common issue across previous studies. Finally, we finetune the models on the existing expert annotations dataset. The evaluation shows that (1) we are able to achieve high performance when training on the ThumbSet dataset and that (2) the proposed models outperform the state-of-the-art hidden Markov model and recurrent neural network baselines. Code, dataset, models, and results are made available to enhance the reproducibility of the task, including a new framework for evaluation.
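The abstract frames fingering prediction as sequence-to-sequence tagging with an autoregressive decoder and beam search. The following is a minimal, hypothetical sketch of that idea, not the authors' implementation: it assumes a PyTorch GRU encoder over MIDI pitch numbers, a decoder that feeds the previously chosen finger label (1-5) back in at each step, and a simple beam search; all class and function names are illustrative.

```python
# Hypothetical sketch: autoregressive fingering tagger with beam search decoding.
# Pitches are MIDI numbers; finger labels are 1-5 (0 is reserved as a start token).
import torch
import torch.nn as nn


class AutoregressiveFingeringModel(nn.Module):
    """Encodes a pitch sequence, then predicts one finger label per note,
    conditioning each step on the previously chosen finger."""

    def __init__(self, n_pitches: int = 128, n_fingers: int = 5, hidden: int = 64):
        super().__init__()
        self.pitch_emb = nn.Embedding(n_pitches, hidden)
        self.finger_emb = nn.Embedding(n_fingers + 1, hidden)  # +1 for start token 0
        self.encoder = nn.GRU(hidden, hidden, batch_first=True, bidirectional=True)
        self.decoder_cell = nn.GRUCell(2 * hidden + hidden, hidden)
        self.out = nn.Linear(hidden, n_fingers)

    def forward(self, pitches: torch.Tensor, prev_fingers: torch.Tensor) -> torch.Tensor:
        # pitches: (batch, T) MIDI numbers; prev_fingers: (batch, T) labels shifted right
        enc, _ = self.encoder(self.pitch_emb(pitches))           # (batch, T, 2*hidden)
        h = torch.zeros(pitches.size(0), self.out.in_features, device=pitches.device)
        logits = []
        for t in range(pitches.size(1)):
            step_in = torch.cat([enc[:, t], self.finger_emb(prev_fingers[:, t])], dim=-1)
            h = self.decoder_cell(step_in, h)
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)                        # (batch, T, n_fingers)


def beam_search(model: AutoregressiveFingeringModel, pitches: torch.Tensor, beam: int = 3):
    """Beam search over finger labels for a single monophonic excerpt."""
    model.eval()
    with torch.no_grad():
        enc, _ = model.encoder(model.pitch_emb(pitches.unsqueeze(0)))  # (1, T, 2*hidden)
        hidden_size = model.out.in_features
        # Each hypothesis: (log-probability, finger sequence, decoder state)
        hyps = [(0.0, [0], torch.zeros(1, hidden_size))]
        for t in range(pitches.size(0)):
            candidates = []
            for score, seq, h in hyps:
                step_in = torch.cat(
                    [enc[:, t], model.finger_emb(torch.tensor([seq[-1]]))], dim=-1)
                h_new = model.decoder_cell(step_in, h)
                log_p = model.out(h_new).log_softmax(dim=-1).squeeze(0)
                for f in range(log_p.size(0)):
                    candidates.append((score + log_p[f].item(), seq + [f + 1], h_new))
            hyps = sorted(candidates, key=lambda c: c[0], reverse=True)[:beam]
        best_score, best_seq, _ = hyps[0]
        return best_seq[1:]  # drop the start token; labels are fingers 1-5


if __name__ == "__main__":
    model = AutoregressiveFingeringModel()
    excerpt = torch.tensor([60, 62, 64, 65, 67])  # C-D-E-F-G, right hand
    print(beam_search(model, excerpt, beam=3))
```

This sketch only covers the monophonic, pitch-sequence case corresponding to the paper's first model; the second model described in the abstract replaces the pitch encoder with a graph neural network so that simultaneous notes (polyphony) can be represented explicitly.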
- dc.description.sponsorship This work is supported in part by the project Musical AI - PID2019-111403GB-I00/AEI/10.13039/501100011033 funded by the Spanish Ministerio de Ciencia, Innovación y Universidades (MCIU) and the Agencia Estatal de Investigación (AEI), Sogang University Research Grant of 202110035.01, and JSPS KAKENHI Nos. 21K12187 and 22H03661.
- dc.format.mimetype application/pdf
- dc.identifier.citation Ramoneda P, Jeong D, Nakamura E, Serra X, Miron M. Automatic piano fingering from partially annotated scores using autoregressive neural networks. In: 30th ACM International Conference on Multimedia (MM'22); 2022 Oct 10-14; Lisboa, Portugal. New York: ACM; 2022. 9 p. DOI: 10.1145/3503161.3548372
- dc.identifier.doi https://doi.org/10.1145/3503161.3548372
- dc.identifier.uri http://hdl.handle.net/10230/54308
- dc.language.iso eng
- dc.publisher ACM Association for Computing Machinery
- dc.relation.ispartof 30th ACM International Conference on Multimedia (MM'22); 2022 Oct 10-14; Lisboa, Portugal. New York: ACM; 2022. 9 p.
- dc.relation.projectID info:eu-repo/grantAgreement/ES/2PE/PID2019-111403GB-I00
- dc.rights © 2022 Association for Computing Machinery
- dc.rights.accessRights info:eu-repo/semantics/openAccess
- dc.subject.keyword Dataset
- dc.subject.keyword Neural Networks
- dc.subject.keyword Automatic Piano Fingering
- dc.subject.keyword Music Education Technologies
- dc.subject.keyword Music Information Retrieval
- dc.title Automatic piano fingering from partially annotated scores using autoregressive neural networks
- dc.type info:eu-repo/semantics/conferenceObject
- dc.type.version info:eu-repo/semantics/acceptedVersion