Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/29390
Title: EEG-DBNet: A Dual-Branch Network for Temporal-Spectral Decoding in Motor-Imagery Brain-Computer Interfaces.
Authors: Lou, X
Li, X
Meng, H
Hu, J
Xu, M
Zhao, Y
Yang, J
Li, Z
Keywords: electroencephalogram (EEG); motor imagery (MI); brain-computer interfaces (BCIs); neural networks
Issue Date: 2024
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Citation: Lou, X. et al. (2024) 'EEG-DBNet: A Dual-Branch Network for Temporal-Spectral Decoding in Motor-Imagery Brain-Computer Interfaces', IEEE Transactions on Neural Systems and Rehabilitation Engineering, 0 (accepted, in press), pp. 1-10.
Abstract: Motor imagery electroencephalogram (EEG)-based brain-computer interfaces (BCIs) offer significant advantages for individuals with restricted limb mobility. However, challenges such as a low signal-to-noise ratio and limited spatial resolution impede accurate feature extraction from EEG signals, thereby limiting the classification accuracy of different actions. To address these challenges, this study proposes an end-to-end dual-branch network (EEG-DBNet) that decodes the temporal and spectral sequences of EEG signals in parallel through two distinct network branches. Each branch comprises a local convolutional block and a global convolutional block. The local convolutional block transforms the source signal from the temporal-spatial domain to the temporal-spectral domain. By varying the number of filters and the convolution kernel sizes, the local convolutional blocks in the two branches adjust the lengths of their respective dimension sequences. Different types of pooling layers are then employed to emphasize the features of each dimension sequence, setting the stage for subsequent global feature extraction. The global convolutional block splits and reconstructs the features of the signal sequence processed by the local convolutional block in the same branch, and further extracts features through dilated causal convolutional neural networks. Finally, the outputs of the two branches are concatenated, and signal classification is completed via a fully connected layer. The proposed method achieves classification accuracies of 85.84% and 91.60% on the BCI Competition 4-2a and BCI Competition 4-2b datasets, respectively, surpassing existing state-of-the-art models.
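The dilated causal convolution used in the global convolutional block can be illustrated with a minimal NumPy sketch. This is a generic, single-channel illustration of the operation itself, not the authors' implementation (see the linked source code for that); the function name and kernels below are hypothetical. Left zero-padding by (kernel_size - 1) * dilation makes the operation causal, so output[t] depends only on x[t], x[t-d], x[t-2d], ...:

```python
import numpy as np

def dilated_causal_conv1d(x, kernel, dilation=1):
    """Single-channel dilated causal 1-D convolution.

    output[t] = sum_j kernel[j] * x[t - j*dilation], with x
    treated as zero before index 0, so no future samples leak in.
    """
    k = len(kernel)
    pad = (k - 1) * dilation  # left-pad so the output is causal and same-length as x
    xp = np.concatenate([np.zeros(pad), np.asarray(x, dtype=float)])
    return np.array([
        sum(kernel[j] * xp[t + pad - j * dilation] for j in range(k))
        for t in range(len(x))
    ])

x = np.array([1.0, 2.0, 3.0, 4.0])
# kernel [1, 1] with dilation 2: output[t] = x[t] + x[t-2]
y = dilated_causal_conv1d(x, [1.0, 1.0], dilation=2)
print(y)  # [1. 2. 4. 6.]
```

Stacking such layers with geometrically increasing dilation (1, 2, 4, ...) grows the receptive field exponentially while keeping the output aligned with the input, which is why these layers suit global feature extraction over long EEG sequences.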
Description: The source code is available at https://github.com/xicheng105/EEG-DBNet .
A preprint version of the article is available at arXiv:2405.16090v3 [cs.HC], https://arxiv.org/abs/2405.16090 . It has not been certified by peer review.
URI: https://bura.brunel.ac.uk/handle/2438/29390
ISSN: 1534-4320
Other Identifiers: ORCiD: Hongying Meng https://orcid.org/0000-0002-8836-1382
CoRR abs/2405.16090
arXiv:2405.16090v3 [cs.HC]
Appears in Collections: Dept of Computer Science Embargoed Research Papers

Files in This Item:
File: FullText.pdf
Description: Embargoed until publication
Size: 1.28 MB
Format: Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.