dc.contributor.author |
Gutiérrez Páez, Nicolás |
dc.contributor.author |
Gómez Cañón, Juan Sebastián |
dc.contributor.author |
Porcaro, Lorenzo |
dc.contributor.author |
Santos Rodríguez, Patrícia |
dc.contributor.author |
Hernández Leo, Davinia |
dc.contributor.author |
Gómez Gutiérrez, Emilia, 1975- |
dc.date.accessioned |
2021-09-03T11:49:58Z |
dc.date.available |
2021-09-03T11:49:58Z |
dc.date.issued |
2021 |
dc.identifier.citation |
Gutiérrez Páez NF, Gómez-Cañón JS, Porcaro L, Santos P, Hernández-Leo D, Gómez E. Emotion annotation of music: a citizen science approach. In: Hernández-Leo D, Hishiyama R, Zurita G, Weyers B, Nolte A, Ogata H, editors. Collaboration Technologies and Social Computing. Proceedings of the 27th International Conference, CollabTech 2021; 2021 Aug 31-Sep 3; Cham, Switzerland. Cham: Springer; 2021. p. 51-66. (LNCS; no. 12856). DOI: 10.1007/978-3-030-85071-5_4 |
dc.identifier.isbn |
978-3-030-85070-8 |
dc.identifier.issn |
0302-9743 |
dc.identifier.uri |
http://hdl.handle.net/10230/48384 |
dc.description |
Communication presented at: 27th International Conference, CollabTech 2021, held virtually from 31 August to 3 September 2021. |
dc.description.abstract |
The understanding of emotions in music has motivated research across diverse areas of knowledge for decades. In the field of computer science, there is particular interest in developing algorithms to “predict” the emotions in music perceived by or induced in a listener. However, gathering reliable “ground truth” data for modeling the emotional content of music poses challenges, since emotion annotation tasks are time-consuming, expensive, and cognitively demanding due to their inherent subjectivity and cross-disciplinary nature. Citizen science projects have proven to be a useful approach to this type of problem, where collaborators must be recruited for massive-scale tasks. We developed a platform for annotating emotional content in musical pieces following a citizen science approach, benefiting not only the researchers, who obtain the generated dataset, but also the volunteers, who are engaged in the research project not only by providing annotations but also through self- and community-awareness of the emotional perception of music. Likewise, gamification mechanisms motivate participants to explore and discover new music based on its emotional content. Preliminary user evaluations showed that the platform design is in line with the motivations of the general public, and that the citizen science approach offers iterative refinement to enhance the quantity and quality of contributions by involving volunteers in the design process. The usability of the platform was acceptable, although some features require improvements. |
dc.description.sponsorship |
This work has been partially funded by the TROMPA project, European Union’s Horizon 2020 research and innovation programme under grant agreement no. 770376. TIDE-UPF also acknowledges the support by FEDER, the National Research Agency of the Spanish Ministry of Science and Innovation, TIN2017-85179-C3-3-R, PID2020-112584RB-C33, MDM-2015-0502, the Ramon y Cajal programme (P. Santos) and by ICREA under the ICREA Academia programme (D. Hernández-Leo, Serra Hunter). |
dc.format.mimetype |
application/pdf |
dc.language.iso |
eng |
dc.publisher |
Springer |
dc.relation.ispartof |
Hernández-Leo D, Hishiyama R, Zurita G, Weyers B, Nolte A, Ogata H, editors. Collaboration Technologies and Social Computing. Proceedings of the 27th International Conference, CollabTech 2021; 2021 Aug 31-Sep 3; Cham, Switzerland. Cham: Springer; 2021. (LNCS; no. 12856) |
dc.rights |
© Springer Nature Switzerland AG 2021 |
dc.title |
Emotion annotation of music: a citizen science approach |
dc.type |
info:eu-repo/semantics/conferenceObject |
dc.identifier.doi |
http://dx.doi.org/10.1007/978-3-030-85071-5_4 |
dc.subject.keyword |
Citizen science |
dc.subject.keyword |
Crowdsourcing |
dc.subject.keyword |
Collaborative annotation |
dc.subject.keyword |
Music Emotion Recognition |
dc.subject.keyword |
Motivations |
dc.relation.projectID |
info:eu-repo/grantAgreement/EC/H2020/770376 |
dc.relation.projectID |
info:eu-repo/grantAgreement/ES/2PE/TIN2017-85179-C3-3-R |
dc.relation.projectID |
info:eu-repo/grantAgreement/ES/2PE/PID2020-112584RB-C33 |
dc.rights.accessRights |
info:eu-repo/semantics/openAccess |
dc.type.version |
info:eu-repo/semantics/acceptedVersion |