Classification of complex emotions using EEG and virtual environment: proof of concept and therapeutic implication

dc.contributor.author De Filippi, Eleonora
dc.contributor.author Wolter, Mara
dc.contributor.author Melo, Bruno R. P.
dc.contributor.author Tierra-Criollo, Carlos J.
dc.contributor.author Bortolini, Tiago
dc.contributor.author Deco, Gustavo
dc.contributor.author Moll, Jorge
dc.date.accessioned 2022-06-21T05:57:15Z
dc.date.available 2022-06-21T05:57:15Z
dc.date.issued 2021
dc.identifier.citation De Filippi E, Wolter M, Melo BRP, Tierra-Criollo CJ, Bortolini T, Deco G, Moll J. Classification of complex emotions using EEG and virtual environment: proof of concept and therapeutic implication. Front Hum Neurosci. 2021;15:711279. DOI: 10.3389/fnhum.2021.711279
dc.identifier.issn 1662-5161
dc.identifier.uri http://hdl.handle.net/10230/53540
dc.description.abstract During the last decades, neurofeedback training for emotional self-regulation has received significant attention from the scientific and clinical communities. Most studies have investigated emotions using functional magnetic resonance imaging (fMRI), including its real-time application in neurofeedback training. However, the electroencephalogram (EEG) is a more suitable tool for therapeutic application. Our study aims to establish a method for classifying discrete complex emotions (e.g., tenderness and anguish) elicited through a near-immersive scenario that can later be used for EEG-neurofeedback. EEG-based affective computing studies have mainly focused on dimension-based emotion classification, commonly using passive elicitation through single-modality stimuli. Here, we integrated both passive and active elicitation methods. We recorded electrophysiological data during emotion-evoking trials, combining emotional self-induction with a multimodal virtual environment. We extracted correlational and time-frequency features, including frontal-alpha asymmetry (FAA), using complex Morlet wavelet convolution. With future real-time applications in mind, we performed within-subject classification using 1-s windows as samples and applied trial-specific cross-validation. We opted for a traditional machine-learning classifier with low computational complexity and sufficient validation in online settings, the Support Vector Machine (a minimal sketch of this pipeline follows the record below). Results of individual-based cross-validation using the whole feature sets showed considerable between-subject variability: individual accuracies ranged from 59.2 to 92.9% using time-frequency/FAA features and from 62.4 to 92.4% using correlational features. We found that features from the temporal, occipital, and left-frontal channels were the most discriminative between the two emotions. Our results show that the proposed pipeline is suitable for individual-based classification of discrete emotions, paving the way for personalized EEG-neurofeedback training.
dc.format.mimetype application/pdf
dc.language.iso eng
dc.publisher Frontiers
dc.relation.ispartof Frontiers in human neuroscience. 2021;15:711279.
dc.rights © 2021 De Filippi, Wolter, Melo, Tierra-Criollo, Bortolini, Deco and Moll. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
dc.rights.uri https://creativecommons.org/licenses/by/4.0/
dc.title Classification of complex emotions using EEG and virtual environment: proof of concept and therapeutic implication
dc.type info:eu-repo/semantics/article
dc.identifier.doi https://doi.org/10.3389/fnhum.2021.711279
dc.subject.keyword emotions
dc.subject.keyword electroencephalography
dc.subject.keyword classification
dc.subject.keyword machine learning
dc.subject.keyword neurofeedback
dc.subject.keyword multimodal virtual scenario
dc.rights.accessRights info:eu-repo/semantics/openAccess
dc.type.version info:eu-repo/semantics/publishedVersion
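The abstract above outlines a concrete pipeline: power features from complex Morlet wavelet convolution, frontal-alpha asymmetry (FAA), 1-s windows as samples, and an SVM evaluated with trial-specific cross-validation. Below is a minimal, self-contained sketch of that kind of pipeline on synthetic data. The sampling rate, channel count, wavelet parameters, frontal channel pair, and SVM settings are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of an FAA/alpha-power + SVM pipeline with trial-grouped CV.
# All data here is synthetic; parameters are assumptions for illustration.
import numpy as np
from sklearn.model_selection import GroupKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
fs = 250                        # sampling rate in Hz (assumed)
win = fs                        # 1-s windows, as in the study
n_trials, n_chan = 20, 8        # toy dimensions (assumed)

def morlet_power(x, freq, fs, n_cycles=7):
    """Power at one frequency via convolution with a complex Morlet wavelet."""
    t = np.arange(-3.5, 3.5, 1 / fs)            # wavelet time support
    sigma = n_cycles / (2 * np.pi * freq)       # Gaussian width from cycle count
    wavelet = np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma**2))
    analytic = np.convolve(x, wavelet, mode="same")
    return np.abs(analytic) ** 2

# Build 1-s samples: for each trial, slide a 1-s window over the signal,
# take mean alpha (10 Hz) power per channel, and add FAA computed as
# log(right) - log(left) over an assumed frontal pair (stand-ins for F4/F3).
X, y, groups = [], [], []
for trial in range(n_trials):
    label = trial % 2                             # alternate the two emotions
    eeg = rng.standard_normal((n_chan, 10 * fs))  # 10 s of toy EEG per trial
    alpha = np.stack([morlet_power(ch, 10.0, fs) for ch in eeg])
    for start in range(0, eeg.shape[1] - win + 1, win):
        seg = alpha[:, start:start + win].mean(axis=1)
        faa = np.log(seg[1]) - np.log(seg[0])     # channels 0/1 as F3/F4 proxies
        X.append(np.append(seg, faa))
        y.append(label)
        groups.append(trial)                      # every window keeps its trial id

X, y, groups = np.array(X), np.array(y), np.array(groups)

# Trial-specific cross-validation: GroupKFold keeps all windows of a trial in
# the same fold, so no trial leaks between training and test sets.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=GroupKFold(n_splits=5), groups=groups)
print(f"Mean accuracy across folds: {scores.mean():.3f}")
```

The grouping step is the part that matters most: splitting 1-s windows at random would place windows from the same trial in both train and test sets and inflate accuracy, which is what the abstract's "trial-specific cross-validation" guards against. On random data as above, accuracy should hover near chance.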
