
ORIGINAL RESEARCH article

Front. Neurosci.
Sec. Brain Imaging Methods
Volume 18 - 2024 | doi: 10.3389/fnins.2024.1436619
This article is part of the Research Topic Deep Learning Methods and Applications in Brain Imaging for the Diagnosis of Neurological and Psychiatric Disorders.

An epilepsy classification based on FFT and fully convolutional neural network nested LSTM

Provisionally accepted
Jianhao Nie, Huazhong Shu* and Fuzhi Wu
  • Southeast University, Nanjing, China

The final, formatted version of the article will be published soon.

Background and Objective: Epilepsy, which is associated with neuronal damage and functional decline, typically presents patients with numerous challenges in their daily lives. Early diagnosis plays a crucial role in managing the condition and alleviating the patients' suffering. Electroencephalogram (EEG)-based approaches are commonly employed for diagnosing epilepsy because they are effective and non-invasive. In this study, we propose a classification method that combines fast Fourier transform (FFT) feature extraction with convolutional neural network (CNN) and long short-term memory (LSTM) models.

Methods: Whereas most existing methods classify epilepsy within traditional frameworks, our approach first extracts features from the source EEG data and then feeds them into a network for training and recognition. The source data are preprocessed into training and validation sets, and a fully convolutional neural network nested with an LSTM is then used to classify each segment.

Results: On a public test dataset, FFT features were the best performing of the three feature types evaluated in the fully CNN nested LSTM model for epilepsy classification. All experiments yielded strong results, exceeding 96% accuracy, 93% sensitivity, and 96% specificity. Benchmarked against current methodologies, our approach showed consistent and robust performance across all trials, with accuracy above 97.00% throughout and ranging from 97.95% to 99.83% in individual experiments. It was particularly accurate in the AB versus (vs.) CDE comparison, reaching 99.06%.

Conclusion: Our method precisely distinguishes epileptic from non-epileptic individuals, irrespective of whether the participant's eyes are closed or open. It also performs well in categorizing epilepsy states, distinguishing epileptic ictal and interictal states from non-epileptic conditions. An inherent advantage of the automated approach is its ability to disregard whether the EEG data were acquired with the eyes closed or open. Such an approach holds promise for real-world applications, potentially helping medical professionals diagnose epilepsy more efficiently.
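
    The abstract describes a pipeline in which FFT features extracted from EEG segments are fed to a fully convolutional network nested with an LSTM. The following is a minimal, illustrative sketch of such a pipeline in PyTorch; the segment length, filter counts, LSTM size, and the exact nesting arrangement are assumptions made for illustration and are not the authors' reported configuration.

    # Minimal sketch: FFT features from one EEG segment -> CNN -> LSTM -> class logits.
    # Layer sizes and segment length are illustrative assumptions, not the paper's settings.
    import numpy as np
    import torch
    import torch.nn as nn

    def fft_features(segment: np.ndarray) -> np.ndarray:
        """Magnitude spectrum of a single-channel EEG segment (1-D array)."""
        return np.abs(np.fft.rfft(segment)).astype(np.float32)

    class CNNLSTMClassifier(nn.Module):
        def __init__(self, n_classes: int = 2):
            super().__init__()
            # 1-D convolutions extract local patterns from the FFT feature vector.
            self.cnn = nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(2),
                nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(2),
            )
            # LSTM models dependencies along the frequency axis of the CNN output.
            self.lstm = nn.LSTM(input_size=32, hidden_size=64, batch_first=True)
            self.fc = nn.Linear(64, n_classes)

        def forward(self, x):                  # x: (batch, 1, n_freq_bins)
            feats = self.cnn(x)                # (batch, 32, n_freq_bins // 4)
            feats = feats.transpose(1, 2)      # (batch, steps, 32) for the LSTM
            _, (h, _) = self.lstm(feats)
            return self.fc(h[-1])              # class logits

    # Usage example: one 4097-sample segment (Bonn-style) -> FFT features -> logits.
    segment = np.random.randn(4097)
    x = torch.from_numpy(fft_features(segment)).view(1, 1, -1)
    logits = CNNLSTMClassifier()(x)
    print(logits.shape)  # torch.Size([1, 2])

    In practice, the FFT features for all segments would be precomputed, split into training and validation sets, and the network trained with a standard cross-entropy loss; the sketch above only shows the forward pass.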

    Keywords: Electroencephalogram, Fast Fourier transform, Seizure detection, Convolutional neural network, Long short-term memory

    Received: 22 May 2024; Accepted: 16 Jul 2024.

    Copyright: © 2024 Nie, Shu and Wu. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

    * Correspondence: Huazhong Shu, Southeast University, Nanjing, China

    Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.