AUTHOR=Yang Fu, Zhao Xingcong, Jiang Wenge, Gao Pengfei, Liu Guangyuan TITLE=Multi-Method Fusion of Cross-Subject Emotion Recognition Based on High-Dimensional EEG Features JOURNAL=Frontiers in Computational Neuroscience VOLUME=13 YEAR=2019 URL=https://www.frontiersin.org/journals/computational-neuroscience/articles/10.3389/fncom.2019.00053 DOI=10.3389/fncom.2019.00053 ISSN=1662-5188 ABSTRACT=
Emotion recognition from electroencephalogram (EEG) signals has attracted considerable research attention; however, recognition accuracy remains difficult to improve when models are applied across subjects. To address this difficulty, multiple features were extracted in this study and combined into high-dimensional feature vectors. Based on these high-dimensional features, an effective method for cross-subject emotion recognition was developed that integrates a significance test, sequential backward selection, and a support vector machine (ST-SBSSVM). The effectiveness of the ST-SBSSVM was validated on the Dataset for Emotion Analysis using Physiological signals (DEAP) and the SJTU Emotion EEG Dataset (SEED). With respect to the high-dimensional features, the ST-SBSSVM improved the average accuracy of cross-subject emotion recognition by 12.4% on DEAP and 26.5% on SEED compared with common emotion recognition methods. On DEAP, the recognition accuracy obtained with the ST-SBSSVM matched that obtained with sequential backward selection (SBS) alone; on SEED, it was ~6% higher. In addition, the ST-SBSSVM eliminated ~97% (DEAP) and ~91% (SEED) of the program runtime required by the SBS. Compared with recent similar works, the method developed in this study was effective for emotion recognition across all subjects, achieving accuracies of 72% (DEAP) and 89% (SEED).
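The abstract outlines a three-stage pipeline (significance-test filtering, sequential backward selection wrapped around an SVM, then SVM classification). The sketch below is a minimal, hedged illustration of that kind of pipeline using scikit-learn and synthetic data in place of real DEAP/SEED features; the variable names, thresholds, and feature counts are assumptions for demonstration only and are not the authors' implementation.

```python
# Illustrative ST-SBS-SVM-style pipeline (assumptions: synthetic data, scikit-learn,
# p < 0.05 significance threshold, 5 retained features) -- not the paper's exact method.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score, LeaveOneGroupOut

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))                 # 200 trials x 50 "high-dimensional" EEG features
w = np.zeros(50); w[:10] = 1.0                 # make the first 10 features informative
y = (X @ w + rng.normal(scale=2.0, size=200) > 0).astype(int)  # binary emotion label
groups = rng.integers(0, 10, size=200)         # subject IDs for cross-subject evaluation

X = StandardScaler().fit_transform(X)

# Step 1: significance test (ST) -- keep features whose class-wise means differ (p < 0.05).
_, p = ttest_ind(X[y == 0], X[y == 1], axis=0)
X_st = X[:, p < 0.05] if (p < 0.05).any() else X

# Step 2: sequential backward selection (SBS) wrapped around an SVM.
svm = SVC(kernel="rbf", C=1.0)
sbs = SequentialFeatureSelector(svm, n_features_to_select=5, direction="backward", cv=3)
X_sel = sbs.fit_transform(X_st, y)

# Step 3: leave-one-subject-out evaluation of the SVM on the selected features
# (selection is done on pooled data here for brevity, which a rigorous study would avoid).
scores = cross_val_score(svm, X_sel, y, groups=groups, cv=LeaveOneGroupOut())
print(f"cross-subject accuracy: {scores.mean():.3f}")
```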