Finger-string interaction analysis in guitar playing with optical motion capture


  • dc.contributor.author Pérez Carrillo, Alfonso Antonio, 1977-
  • dc.date.accessioned 2023-01-17T09:10:20Z
  • dc.date.available 2023-01-17T09:10:20Z
  • dc.date.issued 2019
  • dc.description.abstract We present a method for the analysis of the finger-string interaction in guitar performances and the computation of fine actions during the plucking gesture. The method is based on motion capture using high-speed cameras that track the position of reflective markers placed on the guitar and fingers, in combination with audio analysis. A major problem inherent in optical motion capture is marker occlusion; in guitar playing, the guitarist's right hand is extremely difficult to capture, especially during the plucking process, where tracking of the markers at the fingertips is lost very frequently. This work presents two models that allow the reconstruction of the position of occluded markers: a rigid-body model to track the motion of the guitar strings and a flexible-body model to track the motion of the hands. In combination with audio analysis (onset and pitch detection), the method can estimate a comprehensive set of sound-control features that include the plucked string, the plucking finger, and the characteristics of the plucking gesture in the contact, pressure, and release phases (e.g., position, timing, velocity, direction, or string displacement). (An illustrative sketch of the rigid-body reconstruction step is given after this record.)
  • dc.description.sponsorship This work has been sponsored by the European Union Horizon 2020 research and innovation program under grant agreement No. 688269 (TELMI project) and Beatriu de Pinos grant 2010 BP-A 00209 by the Catalan Research Agency (AGAUR).
  • dc.format.mimetype application/pdf
  • dc.identifier.citation Perez-Carrillo A. Finger-string interaction analysis in guitar playing with optical motion capture. Front Comput Sci. 2019 Nov 15;1:8. DOI: 10.3389/fcomp.2019.00008
  • dc.identifier.doi http://dx.doi.org/10.3389/fcomp.2019.00008
  • dc.identifier.issn 2624-9898
  • dc.identifier.uri http://hdl.handle.net/10230/55306
  • dc.language.iso eng
  • dc.publisher Frontiers
  • dc.relation.ispartof Frontiers in Computer Science. 2019 Nov 15;1:8
  • dc.relation.projectID info:eu-repo/grantAgreement/EC/H2020/688269
  • dc.rights © 2019 Perez-Carrillo. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
  • dc.rights.accessRights info:eu-repo/semantics/openAccess
  • dc.rights.uri http://creativecommons.org/licenses/by/4.0/
  • dc.subject.keyword Motion capture
  • dc.subject.keyword Guitar performance
  • dc.subject.keyword Hand model
  • dc.subject.keyword Audio analysis
  • dc.subject.keyword Neural networks
  • dc.title Finger-string interaction analysis in guitar playing with optical motion capture
  • dc.type info:eu-repo/semantics/article
  • dc.type.version info:eu-repo/semantics/publishedVersion
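
The abstract above describes a rigid-body model for recovering the position of occluded markers. As a rough, hypothetical illustration of that idea (not the paper's actual implementation), the Python sketch below fits a least-squares rigid transform (Kabsch algorithm) to the markers that remain visible in a frame and applies it to the stored reference position of an occluded marker. The marker layout and coordinate values are invented for the example.

```python
import numpy as np

def rigid_transform(ref_pts, cur_pts):
    """Least-squares rotation R and translation t mapping ref_pts onto cur_pts
    (Kabsch algorithm); both arrays are N x 3 with N >= 3 visible markers."""
    ref_c = ref_pts.mean(axis=0)
    cur_c = cur_pts.mean(axis=0)
    H = (ref_pts - ref_c).T @ (cur_pts - cur_c)     # 3 x 3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cur_c - R @ ref_c
    return R, t

def reconstruct_occluded(ref_visible, cur_visible, ref_occluded):
    """Estimate the current position of an occluded marker from its reference
    position and the rigid transform fitted on the visible markers."""
    R, t = rigid_transform(ref_visible, cur_visible)
    return R @ ref_occluded + t

# Hypothetical frame: three guitar-body markers stay visible while a fourth,
# near the bridge, is hidden by the plucking hand.  Reference positions are
# in metres in an arbitrary lab frame (invented values).
ref_visible  = np.array([[0.00, 0.00, 0.00],
                         [0.30, 0.00, 0.00],
                         [0.00, 0.10, 0.02]])
ref_occluded = np.array([0.25, 0.05, 0.01])

# Simulate the guitar moving by a small rotation plus a translation.
theta = np.deg2rad(5.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([0.01, 0.02, 0.00])
cur_visible = ref_visible @ R_true.T + t_true

estimate = reconstruct_occluded(ref_visible, cur_visible, ref_occluded)
print(estimate)                        # reconstructed marker position
print(R_true @ ref_occluded + t_true)  # ground truth for comparison
```

Because the guitar (and the string anchor points on it) moves rigidly, one fitted transform per frame can restore every occluded marker on the instrument; the flexible-body hand model described in the abstract would instead require per-segment transforms or a learned model rather than a single rigid fit.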