TY - JOUR
T1 - Deep fusion of multi-channel neurophysiological signal for emotion recognition and monitoring
AU - Li, Xiang
AU - Song, Dawei
AU - Zhang, Peng
AU - Hou, Yuexian
AU - Hu, Bin
N1 - Publisher Copyright:
Copyright © 2017 Inderscience Enterprises Ltd.
PY - 2017
Y1 - 2017
N2 - How to fuse multi-channel neurophysiological signals for emotion recognition is emerging as a hot research topic in the community of Computational Psychophysiology. However, prior feature-engineering-based approaches require extracting various domain-knowledge-related features at a high time cost. Moreover, traditional fusion methods cannot fully utilise the correlation information between different channels and frequency components. In this paper, we design a hybrid deep learning model in which a Convolutional Neural Network (CNN) is utilised to extract task-related features and to mine inter-channel and inter-frequency correlations, and a Recurrent Neural Network (RNN) is concatenated to integrate contextual information from the frame cube sequence. Experiments are carried out on a trial-level emotion recognition task using the DEAP benchmarking dataset. Experimental results demonstrate that the proposed framework outperforms classical methods on both emotional dimensions of Valence and Arousal.
AB - How to fuse multi-channel neurophysiological signals for emotion recognition is emerging as a hot research topic in the community of Computational Psychophysiology. However, prior feature-engineering-based approaches require extracting various domain-knowledge-related features at a high time cost. Moreover, traditional fusion methods cannot fully utilise the correlation information between different channels and frequency components. In this paper, we design a hybrid deep learning model in which a Convolutional Neural Network (CNN) is utilised to extract task-related features and to mine inter-channel and inter-frequency correlations, and a Recurrent Neural Network (RNN) is concatenated to integrate contextual information from the frame cube sequence. Experiments are carried out on a trial-level emotion recognition task using the DEAP benchmarking dataset. Experimental results demonstrate that the proposed framework outperforms classical methods on both emotional dimensions of Valence and Arousal.
KW - Affective computing
KW - CNN
KW - EEG
KW - Emotion recognition
KW - LSTM
KW - Multi-channel data fusion
KW - Multi-modal data fusion
KW - Physiological signal
KW - RNN
KW - Time series data analysis
UR - http://www.scopus.com/inward/record.url?scp=85028704332&partnerID=8YFLogxK
U2 - 10.1504/IJDMB.2017.10007183
DO - 10.1504/IJDMB.2017.10007183
M3 - Article
AN - SCOPUS:85028704332
SN - 1748-5673
VL - 18
SP - 1
EP - 27
JO - International Journal of Data Mining and Bioinformatics
JF - International Journal of Data Mining and Bioinformatics
IS - 1
ER -