
Automatic emotion recognition from EEG signals

Abstract: While electroencephalographic (EEG) recording has long been confined to the medical field, recent years have seen growing interest in EEG-based brain-computer interfaces (BCI) for general-public applications. In particular, EEG recording has attracted the attention of researchers in affective computing as part of the effort to perform human-behaviour analysis tasks, especially automatic emotion recognition. Compared to other modalities considered in previous work on emotion recognition, such as speech, facial expressions, gestures or other physiological signals, EEG has the advantage of capturing information related to internal emotional states that do not necessarily result in any observable external manifestations. Emotion recognition is usually approached as a classification problem in which the choice of appropriate features is critical to ensure satisfactory recognition accuracy. As far as EEG features are concerned, no consensus has yet been reached on a standard set of attributes that could guarantee a successful characterization of a human subject's emotions. We explore a wide range of temporal, spectral and spatial features potentially useful for emotion recognition, comparing them to previous proposals through a rigorous experimental evaluation. In particular, we assess the effectiveness of various spectral features that had not previously been envisaged for the problem of classifying emotions. Our results show that the new spectral shape features we propose are very competitive with previously used ones. They are additionally amenable to successful emotion recognition in single-channel setups, which holds great potential for general-public applications. The existing, accessible datasets for emotion recognition in affective computing do not consider the dynamics of emotion.
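The thesis's own spectral shape features are not detailed in this abstract, so as a minimal illustration only, the sketch below computes canonical EEG band-power features (delta, theta, alpha, beta, gamma) from a single-channel signal via a plain FFT periodogram; the band limits and the function name are assumptions, not the author's method.

```python
import numpy as np

def band_power_features(signal, fs, bands=None):
    """Average spectral power of a single-channel EEG signal in the
    canonical frequency bands, via a simple FFT periodogram."""
    if bands is None:
        # Conventional band limits in Hz (an assumption; conventions vary)
        bands = {"delta": (1, 4), "theta": (4, 8), "alpha": (8, 13),
                 "beta": (13, 30), "gamma": (30, 45)}
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    # Average the periodogram inside each band's frequency mask
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in bands.items()}

# Example: 4 s of synthetic "EEG" at 256 Hz dominated by a 10 Hz (alpha) rhythm
fs = 256
t = np.arange(0, 4, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
feats = band_power_features(x, fs)
print(max(feats, key=feats.get))  # the alpha band dominates here
```

In a classification pipeline, such per-band powers (possibly log-scaled and computed per channel) would form the feature vector fed to the classifier; single-channel setups, as discussed above, would use only one channel's features.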
Those datasets also show a lack of variability in the data and contain recordings from a limited number of participants. These shortcomings have led us to propose a new multimodal dataset for the analysis of emotional state that aims to address some of the weaknesses of existing datasets. We use different strategies for emotion elicitation, through both visual and audio-visual stimuli. We also provide an innovative annotation approach for the emotion experienced by the user, recording not only the emotion felt globally but also its variations over time. In a first stage, this new dataset will support the validation of the approach we propose in our work. Secondly, given the encouraging results, it will become possible to consider a longer-term characterization of the dynamics of the user's emotional state, opening the way for new models that could predict, for example, an increase in the user's anxiety according to the situation in which they are placed.

Cited literature: 349 references
Contributor : ABES STAR
Submitted on : Monday, October 5, 2020 - 2:52:08 PM
Last modification on : Wednesday, October 14, 2020 - 4:03:52 AM


Version validated by the jury (STAR)


  • HAL Id : tel-02957939, version 1


Anne-Claire Conneau. Reconnaissance automatique de l'émotion à partir de signaux EEG. Imagerie médicale. Télécom ParisTech, 2016. Français. ⟨NNT : 2016ENST0033⟩. ⟨tel-02957939⟩


