Biosignal-Based Multimodal Emotion Recognition in a Valence-Arousal Affective Framework Applied to Immersive Video Visualization
Fred, A. L. N.
Biosignal-Based Multimodal Emotion Recognition in a Valence-Arousal Affective Framework Applied to Immersive Video Visualization, Proc. International Conf. of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany, pp. 3577-3583, July, 2019.
Digital Object Identifier: 10.1109/EMBC.2019.8857852
Many emotion recognition schemes have been proposed in the literature. They generally differ in terms of the emotion elicitation method, the target emotional states to recognize, the data sources or modalities, and the classification techniques. In this work, several biosignals are explored for emotion assessment during immersive video visualization, collecting multimodal data from Electrocardiography (ECG), Electrodermal Activity (EDA), Blood Volume Pulse (BVP), and Respiration sensors. Participants reported their emotional state of the day (baseline) and provided a self-assessment of the emotion experienced in each video through the Self-Assessment Manikin (SAM), in the valence-arousal space. Multiple physiological and statistical features extracted from the biosignals were used as inputs to an emotion recognition workflow, targeting user-independent classification with two classes per dimension. Support Vector Machines (SVM) were used, as they are considered among the most promising classifiers in the field. The proposed approach led to accuracies of 69.13% for arousal and 67.75% for valence, which are encouraging for further research with a larger training dataset and population.
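As a rough illustration of the workflow described in the abstract, the sketch below computes a small set of statistical features from a biosignal segment of the kind that could feed a per-dimension (valence or arousal) binary SVM classifier. The specific feature set and function names here are illustrative assumptions, not the paper's actual feature list.

```python
import numpy as np

def extract_features(signal):
    """Illustrative statistical features from one biosignal segment
    (e.g., an EDA or BVP window); not the paper's exact feature set."""
    x = np.asarray(signal, dtype=float)
    diff = np.diff(x)  # first differences capture signal dynamics
    return {
        "mean": float(x.mean()),
        "std": float(x.std()),
        "min": float(x.min()),
        "max": float(x.max()),
        # root mean square of the segment amplitude
        "rms": float(np.sqrt(np.mean(x ** 2))),
        # mean absolute first difference, a common dynamics feature
        "mean_abs_diff": float(np.mean(np.abs(diff))),
    }

# Example segment (arbitrary values, for illustration only)
feats = extract_features([1.0, 2.0, 3.0, 2.0, 1.0])
```

Feature vectors like this, concatenated across the ECG, EDA, BVP, and Respiration channels, would then be passed to a binary SVM (one model per affective dimension) for the user-independent classification described above.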