
CORRECTION article
Front. Neurosci., 28 November 2023
Sec. Auditory Cognitive Neuroscience
Volume 17 - 2023 | https://doi.org/10.3389/fnins.2023.1283644
A corrigendum on
Multi-channel EEG emotion recognition through residual graph attention neural network
by Chao, H., Cao, Y., and Liu, Y. (2023). Front. Neurosci. 17:1135850. doi: 10.3389/fnins.2023.1135850
In the published article, there was an error in the Funding statement: the fund name was incorrect and the fund number was missing. The original text read: "Programs for Science and Technology Development of Henan province." The correct Funding statement appears below.
Programs for Science and Technology Development of Henan province No. 222102210078.
The authors apologize for this error and state that this does not change the scientific conclusions of the article in any way. The original article has been updated.
All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article, or claim that may be made by its manufacturer, is not guaranteed or endorsed by the publisher.
Keywords: EEG, emotion recognition, residual network, graph attention neural network, feature fusion
Citation: Chao H, Cao Y and Liu Y (2023) Corrigendum: Multi-channel EEG emotion recognition through residual graph attention neural network. Front. Neurosci. 17:1283644. doi: 10.3389/fnins.2023.1283644
Received: 26 August 2023; Accepted: 13 November 2023;
Published: 28 November 2023.
Approved by:
Frontiers Editorial Office, Frontiers Media SA, Switzerland
Copyright © 2023 Chao, Cao and Liu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
*Correspondence: Hao Chao, chaohao1981@163.com