Title:
A multimodal emotion recognition method based on facial expressions and electroencephalography
Author(s):
TAN, YING; SUN, ZHE; DUAN, FENG; SOLÉ-CASALS, JORDI; CAIAFA, CESAR F.
Journal:
BIOMEDICAL SIGNAL PROCESSING AND CONTROL
Publisher:
ELSEVIER SCI LTD
References:
Year: 2021, vol. 70
ISSN:
1746-8094
Abstract:
Human-robot interaction (HRI) systems play a critical role in society. However, most HRI systems still suffer from disharmony, resulting in inefficient communication between the human and the robot. In this paper, a multimodal emotion recognition method based on facial expressions and electroencephalography (EEG) is proposed to build an HRI system with a low sense of disharmony. An image classification method for facial expressions and a suitable feature extraction method for EEG were first investigated on public datasets, and then applied to images and EEG data acquired by the authors. In addition, a Monte Carlo method was used to merge the results of the two modalities and to mitigate the problem of having a small dataset. The multimodal emotion recognition method was integrated into the HRI system, where it achieved a recognition rate of 83.33%. Furthermore, to evaluate the HRI system from the user's point of view, a perceptual assessment method was proposed in which participants scored the system based on their experience, yielding an average score of 7 (on a scale from 0 to 10). Experimental results demonstrate the effectiveness and feasibility of the multimodal emotion recognition method, which can help reduce the sense of disharmony in HRI systems.
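The abstract does not detail how the Monte Carlo merging of the facial-expression and EEG results is carried out, so the sketch below is only one plausible interpretation: each modality's class-probability output is sampled repeatedly, and the emotion class receiving the most sampled votes is taken as the fused prediction. All names (monte_carlo_fusion, p_face, p_eeg) and the three-class example are hypothetical and for illustration only, not the authors' implementation.

```python
import numpy as np

def monte_carlo_fusion(p_face, p_eeg, n_samples=10000, rng=None):
    """Fuse two per-class probability vectors by repeated random sampling.

    p_face, p_eeg : 1-D arrays of class probabilities (same length),
    e.g. softmax outputs of a facial-expression model and an EEG model.
    Returns the index of the class drawn most often across both modalities.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_classes = len(p_face)
    votes = np.zeros(n_classes, dtype=int)
    # Draw class labels from each modality's distribution and count them as votes.
    face_draws = rng.choice(n_classes, size=n_samples, p=p_face)
    eeg_draws = rng.choice(n_classes, size=n_samples, p=p_eeg)
    np.add.at(votes, face_draws, 1)
    np.add.at(votes, eeg_draws, 1)
    return int(np.argmax(votes))

# Hypothetical example with three emotion classes (e.g. positive, neutral, negative).
p_face = np.array([0.55, 0.30, 0.15])  # facial-expression model output
p_eeg = np.array([0.40, 0.45, 0.15])   # EEG model output
print(monte_carlo_fusion(p_face, p_eeg))  # most likely prints 0 (class favored by both)
```

In this reading, the repeated sampling acts as a simple stochastic voting scheme, which also gives a way to generate many fused decisions from a small number of recorded trials.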