This work presents an initial proof of concept of how Music Emotion Recognition (MER) systems could be intentionally biased with respect to annotations of musically induced emotions in a political context. Specifically, we analyze traditional Colombian music containing politically charged lyrics of two types: (1) vallenatos and social songs from the “left-wing” guerrilla Fuerzas Armadas Revolucionarias de Colombia (FARC), and (2) corridos from the “right-wing” paramilitaries Autodefensas Unidas de Colombia (AUC). We train personalized machine learning models to predict induced emotions for three users with diverse political views, aiming to identify the songs that may induce negative emotions, such as anger and fear, for a particular user. In this sense, a user's emotion judgments could be interpreted as problematic data: subjective emotional judgments could in turn be used to influence that user in a human-centered machine learning environment. In short, highly desired “emotion regulation” applications could potentially drift toward “emotion manipulation”, and the recent discrediting of emotion recognition technologies might extend beyond ethical issues of diversity and inclusion.
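To make the personalization step concrete, the sketch below shows one possible way such per-user models could be set up: a classifier is fitted only on a single user's induced-emotion annotations over precomputed song features, and songs predicted to induce anger or fear for that user are flagged. The feature representation, the label set, and the choice of logistic regression are illustrative assumptions; the abstract does not specify the actual models used.

```python
# Minimal sketch (not the authors' implementation): a personalized induced-emotion
# classifier trained on one user's own annotations. Features, labels, and the
# logistic regression model are placeholders for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical data: one feature vector per song (e.g., audio and lyric
# descriptors) and one induced-emotion label per song from a single user.
n_songs, n_features = 60, 20
X = rng.normal(size=(n_songs, n_features))                          # placeholder song features
y = rng.choice(["joy", "anger", "fear", "sadness"], size=n_songs)   # this user's annotations

# Personalization here simply means fitting on one user's annotations only,
# so predictions reflect that user's subjective emotional responses.
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
print("cross-validated accuracy:", cross_val_score(model, X, y, cv=3).mean())

# Flag songs predicted to induce negative emotions (anger or fear) for this user.
model.fit(X, y)
negative = np.isin(model.predict(X), ["anger", "fear"])
print("songs flagged as potentially negative:", np.flatnonzero(negative)[:10])
```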