Design of a Multimodal Interface based on Psychophysiological Sensing to Identify Emotion
Design of a Multimodal Interface based on Psychophysiological Sensing to Identify Emotion, Proc. IMEKO TC4 Symp., Iasi, Romania, Vol. 1, pp. 1-6, September 2017.
Digital Object Identifier:
This work proposes the design of a multimodal interface to classify or estimate emotion states. Seven emotions are considered: anger, boredom, disgust, anxiety/fear, happiness, sadness, and normal. Several sensing technologies are used to collect psychophysiological signals related to emotion state estimation: galvanic skin response (GSR), heart rate (HR), electrocardiography (ECG), oxygen saturation (SpO2), and electroencephalography (EEG). The International Affective Picture System (IAPS) dataset is used to design the classifier system. For the classification task, a comparison between an artificial neural network (multilayer perceptron, ANN-MLP) and a support vector machine (SVM) is presented. The tests were carried out with 20 healthy volunteers (N_v = 20) of both genders, aged 23-50 years. The proposed classifier reaches an accuracy of 85.71% with the ANN-MLP and 77.14% with the SVM.
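The ANN-MLP versus SVM comparison described above can be sketched as follows. This is a hypothetical illustration only: the study's real features come from GSR, HR, ECG, SpO2, and EEG signals elicited with IAPS stimuli, which are stood in for here by synthetic data, and the hyperparameters shown are assumptions, not the paper's settings.

```python
# Hypothetical sketch of a 7-class emotion classifier comparison (ANN-MLP
# vs. SVM). Synthetic data stands in for the psychophysiological features
# (GSR, HR, ECG, SpO2, EEG) used in the actual study.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

# Synthetic stand-in: 7 emotion classes, 10 illustrative features per sample.
X, y = make_classification(n_samples=280, n_features=10, n_informative=8,
                           n_redundant=0, n_classes=7, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, as is customary for both MLPs and SVMs.
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

# Multilayer perceptron (ANN-MLP); layer size and iterations are assumed.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
acc_mlp = accuracy_score(y_test, mlp.predict(X_test))

# Support vector machine with an RBF kernel (kernel choice is assumed).
svm = SVC(kernel="rbf", C=1.0)
svm.fit(X_train, y_train)
acc_svm = accuracy_score(y_test, svm.predict(X_test))

print(f"MLP accuracy: {acc_mlp:.2%}")
print(f"SVM accuracy: {acc_svm:.2%}")
```

On real data, the two accuracies reported in the abstract (85.71% for ANN-MLP, 77.14% for SVM) would be obtained by replacing the synthetic matrix with the extracted signal features and tuning each model's hyperparameters.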