
ORIGINAL RESEARCH article

Front. Hum. Neurosci.

Sec. Brain-Computer Interfaces

Volume 19 - 2025 | doi: 10.3389/fnhum.2025.1445763

This article is part of the Research Topic: Exploration of Artificial Intelligence for Emotion Recognition Using Physiological Signals.

Cross-subject affective analysis based on dynamic brain functional networks

Provisionally accepted
  • 1 South China Normal University, Guangzhou, China
  • 2 Nanyang Technological University, Singapore, Singapore

The final, formatted version of the article will be published soon.

    Emotion recognition is crucial for facilitating human-computer emotional interaction. To enhance the credibility and realism of emotion recognition, researchers have turned to physiological signals, particularly EEG, because it directly reflects cerebral cortex activity. However, owing to inter-subject variability and the non-stationarity of EEG signals, cross-subject generalization of models remains a challenge. In this study, we propose a novel approach that combines time-frequency analysis and brain functional networks, constructing dynamic brain functional networks with sliding time windows. This integration of the time, frequency, and spatial domains captures features effectively, reduces inter-individual differences, and improves model generalization. To construct the brain functional networks, we use mutual information to quantify the correlation between EEG channels and apply appropriate thresholds. We then extract three network attribute features - global efficiency, local efficiency, and local clustering coefficient - and classify emotions based on these dynamic brain network features. The proposed method is evaluated on the DEAP dataset through subject-dependent (trial-independent), subject-independent, and subject- and trial-independent experiments along both the valence and arousal dimensions. The results demonstrate that the dynamic brain functional network outperforms the static brain functional network in all three experimental settings. In the subject-independent experiments, the dynamic brain functional network achieved high classification accuracies of 90.89% and 91.17% in the valence and arousal dimensions, respectively, a significant advancement in EEG-based emotion recognition. In addition, experiments on individual brain regions showed that the left and right temporal lobes focus on processing subject-specific emotional information, whereas the remaining brain regions process basic emotional information.
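    As a rough illustration of the pipeline described above (not the authors' implementation), the sketch below builds dynamic brain functional network features from multi-channel EEG: it slides a window over a trial, estimates pairwise mutual information between channels from binned samples, thresholds the resulting matrix into a binary graph, and computes global efficiency, local efficiency, and the mean local clustering coefficient with networkx. The window length, step, bin count, and threshold are illustrative assumptions, and the per-band time-frequency decomposition used in the paper is omitted.

```python
# Illustrative sketch only; parameters and libraries are assumptions, not the authors' code.
import numpy as np
import networkx as nx
from sklearn.metrics import mutual_info_score

def mi_matrix(window, n_bins=16):
    """Pairwise mutual information between channels in one window.
    window: array of shape (n_channels, n_samples)."""
    n_ch = window.shape[0]
    # Discretize each channel into equal-width bins for a histogram-based MI estimate.
    binned = np.stack([np.digitize(ch, np.histogram_bin_edges(ch, bins=n_bins))
                       for ch in window])
    mi = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(i + 1, n_ch):
            mi[i, j] = mi[j, i] = mutual_info_score(binned[i], binned[j])
    return mi

def window_features(window, threshold=0.5):
    """Threshold the MI matrix into a binary graph and return
    [global efficiency, average local efficiency, mean local clustering coefficient]."""
    adj = (mi_matrix(window) >= threshold).astype(int)
    np.fill_diagonal(adj, 0)
    G = nx.from_numpy_array(adj)
    mean_clustering = float(np.mean(list(nx.clustering(G).values())))
    return [nx.global_efficiency(G), nx.local_efficiency(G), mean_clustering]

def dynamic_network_features(eeg, fs=128, win_sec=2.0, step_sec=1.0):
    """Slide a window over one trial and stack per-window network attributes.
    eeg: (n_channels, n_samples), e.g. 32 EEG channels of a preprocessed DEAP trial."""
    win, step = int(win_sec * fs), int(step_sec * fs)
    feats = [window_features(eeg[:, s:s + win])
             for s in range(0, eeg.shape[1] - win + 1, step)]
    return np.asarray(feats)  # shape: (n_windows, 3)

# Example with a random stand-in for a 60 s trial sampled at 128 Hz.
features = dynamic_network_features(np.random.randn(32, 60 * 128))
print(features.shape)
```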

    Keywords: EEG, emotion recognition, dynamic brain functional network, subject independence, subject and trial independence

    Received: 08 Jun 2024; Accepted: 18 Mar 2025.

    Copyright: © 2025 You, Liu, He, Zhong and Zhong. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence:
    Tianyu Zhong, Nanyang Technological University, Singapore, 639798, Singapore
    Qinghua Zhong, South China Normal University, Guangzhou, China

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.
