Deep fusion of multi-channel neurophysiological signal for emotion recognition and monitoring

Xiang Li, Dawei Song*, Peng Zhang, Yuexian Hou, Bin Hu

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

28 Citations (Scopus)

Abstract

How to fuse multi-channel neurophysiological signals for emotion recognition is emerging as a hot research topic in the Computational Psychophysiology community. Nevertheless, prior feature-engineering-based approaches require extracting various domain-knowledge-related features at a high time cost. Moreover, traditional fusion methods cannot fully utilise the correlation information between different channels and frequency components. In this paper, we design a hybrid deep learning model in which a Convolutional Neural Network (CNN) is utilised to extract task-related features and mine inter-channel and inter-frequency correlations, while a Recurrent Neural Network (RNN) is concatenated to integrate contextual information from the frame cube sequence. Experiments are carried out on a trial-level emotion recognition task using the DEAP benchmarking dataset. The experimental results demonstrate that the proposed framework outperforms the classical methods on both emotional dimensions, Valence and Arousal.
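The abstract describes a CNN stage applied to per-frame multi-channel input followed by an RNN (LSTM) over the frame sequence. The sketch below illustrates this general CNN + LSTM hybrid; the 9x9 electrode-grid frames, five frequency-band input channels, layer sizes, and sequence length are illustrative assumptions and not the authors' exact configuration.

```python
# Minimal sketch of a CNN + LSTM hybrid for trial-level emotion recognition.
# Frame shape (5 bands on a 9x9 grid), layer widths, and sequence length are
# assumptions for illustration, not the configuration reported in the paper.
import torch
import torch.nn as nn


class CnnLstmEmotionNet(nn.Module):
    def __init__(self, in_channels=5, n_classes=2, hidden_size=64):
        super().__init__()
        # CNN: mines inter-channel / inter-frequency structure within one frame.
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # -> (batch * seq, 64, 1, 1)
        )
        # LSTM: integrates contextual information across the frame cube sequence.
        self.lstm = nn.LSTM(input_size=64, hidden_size=hidden_size,
                            batch_first=True)
        self.classifier = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, seq_len, in_channels, height, width) frame cube sequence
        b, t, c, h, w = x.shape
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)           # (batch, seq_len, hidden_size)
        return self.classifier(out[:, -1])  # classify from the last time step


if __name__ == "__main__":
    # Toy batch: 8 trials, 20 frames each, 5 frequency-band channels on a 9x9 grid.
    model = CnnLstmEmotionNet()
    logits = model(torch.randn(8, 20, 5, 9, 9))
    print(logits.shape)  # torch.Size([8, 2]) -> binary Valence (or Arousal) logits
```

In this kind of pipeline the CNN output per frame becomes one time step of the LSTM input, so spatial (inter-channel, inter-frequency) fusion and temporal context modelling are handled by separate stages of a single end-to-end model.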

Original language: English
Pages (from-to): 1-27
Number of pages: 27
Journal: International Journal of Data Mining and Bioinformatics
Volume: 18
Issue number: 1
DOIs
Publication status: Published - 2017
Externally published: Yes

Keywords

  • Affective computing
  • CNN
  • EEG
  • Emotion recognition
  • LSTM
  • Multi-channel data fusion
  • Multi-modal data fusion
  • Physiological signal
  • RNN
  • Time series data analysis
