Welcome to the UPF Digital Repository

Mapping by observation: building a user-tailored conducting system from spontaneous movements


dc.contributor.author Sarasúa, Álvaro
dc.contributor.author Urbano, Julián
dc.contributor.author Gómez Gutiérrez, Emilia, 1975-
dc.date.accessioned 2019-05-23T14:21:39Z
dc.date.available 2019-05-23T14:21:39Z
dc.date.issued 2019
dc.identifier.citation Sarasúa A, Urbano J, Gomez E. Mapping by observation: building a user-tailored conducting system from spontaneous movements. Front Digit Humanit. 2019;6:3. DOI: 10.3389/fdigh.2019.00003
dc.identifier.issn 2297-2668
dc.identifier.uri http://hdl.handle.net/10230/37283
dc.description.abstract Metaphors are commonly used in interface design within Human-Computer Interaction (HCI). Interface metaphors provide users with a way to interact with the computer that resembles a known activity, giving instantaneous knowledge or intuition about how the interaction works. A widely used one in Digital Musical Instruments (DMIs) is the conductor-orchestra metaphor, where the orchestra is considered as an instrument controlled by the movements of the conductor. We propose a DMI based on the conductor metaphor that allows the user to control tempo and dynamics and adapts its mapping specifically for each user by observing spontaneous conducting movements (i.e., movements performed on top of fixed music without any instructions). We refer to this as mapping by observation given that, even though the system is trained specifically for each user, this training is not done explicitly and consciously by the user. More specifically, the system adapts its mapping based on the tendency of the user to anticipate or fall behind the beat and on the Motion Capture descriptors that best correlate with loudness during spontaneous conducting. We evaluate the proposed system in an experiment with 24 participants, comparing it with a baseline that does not perform this user-specific adaptation. The comparison is done in a context where the user does not receive instructions and, instead, is allowed to discover by playing. We evaluate objective and subjective measures from tasks where participants have to make the orchestra play at different loudness levels or in synchrony with a metronome. Results of the experiment show that the system that automatically learns its mapping from spontaneous movements offers better usability, providing both more intuitive control over loudness and more precise control over beat timing.
Interestingly, the results also show a strong correlation between measures taken from the data used for training and the improvement introduced by the adapting system. This indicates that it is possible to estimate in advance how useful the observation of spontaneous movements is to build user-specific adaptations. This opens interesting directions for creating more intuitive and expressive DMIs, particularly in public installations.
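The descriptor-selection step the abstract describes (finding which Motion Capture descriptor best correlates with loudness during spontaneous conducting) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the descriptor names (`hand_height`, `hand_speed`) and the synthetic data are hypothetical, and the paper's actual descriptor set and selection procedure may differ.

```python
import numpy as np

def pick_loudness_descriptor(descriptors, loudness):
    """Return the descriptor whose frame-wise values correlate most
    strongly (in absolute value) with the loudness curve.

    descriptors: dict mapping a descriptor name to a 1-D array of
                 per-frame values (hypothetical names for illustration)
    loudness:    1-D array of per-frame loudness, same length
    """
    best_name, best_r = None, 0.0
    for name, values in descriptors.items():
        # Pearson correlation between this descriptor and loudness
        r = np.corrcoef(values, loudness)[0, 1]
        if abs(r) > abs(best_r):
            best_name, best_r = name, r
    return best_name, best_r

# Toy example with synthetic data: one descriptor tracks loudness
# closely, the other is unrelated noise.
rng = np.random.default_rng(0)
loud = rng.random(200)
descs = {
    "hand_height": loud * 0.9 + rng.normal(0, 0.05, 200),  # strongly correlated
    "hand_speed": rng.random(200),                          # unrelated
}
name, r = pick_loudness_descriptor(descs, loud)
print(name)  # prints "hand_height"
```

Once selected, such a descriptor could drive the loudness mapping for that specific user, which is the sense in which the mapping is learned "by observation" rather than through explicit calibration.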
dc.description.sponsorship This work was supported by the European Union Seventh Framework Programme FP7 / 2007-2013 through the PHENICX project (grant agreement no. 601166) and by the CASAS project (TIN2015-70816-R).
dc.format.mimetype application/pdf
dc.language.iso eng
dc.publisher Frontiers
dc.relation.ispartof Frontiers in digital humanities. 2019;6:3.
dc.rights © 2019 Sarasúa, Urbano and Gómez. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.title Mapping by observation: building a user-tailored conducting system from spontaneous movements
dc.type info:eu-repo/semantics/article
dc.identifier.doi http://dx.doi.org/10.3389/fdigh.2019.00003
dc.subject.keyword HCI
dc.subject.keyword Digital music
dc.subject.keyword Motion-sound mapping
dc.subject.keyword Kinect
dc.subject.keyword Conducting
dc.subject.keyword Machine learning
dc.subject.keyword Digital musical instruments
dc.relation.projectID info:eu-repo/grantAgreement/EC/FP7/601166
dc.relation.projectID info:eu-repo/grantAgreement/ES/1PE/TIN2015-70816-R
dc.rights.accessRights info:eu-repo/semantics/openAccess
dc.type.version info:eu-repo/semantics/publishedVersion
