Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30386
Title: Selective Auditory Attention Detection Using Combined Transformer and Convolutional Graph Neural Networks
Authors: Geravanchizadeh, M
Shaygan Asl, A
Danishvar, S
Keywords: selective auditory attention detection;graph neural network;transformer;convolutional neural networks;brain connectivity;hybrid neural networks
Issue Date: 30-Nov-2024
Publisher: MDPI
Citation: Geravanchizadeh, M., Shaygan Asl, A. and Danishvar, S. (2024) 'Selective Auditory Attention Detection Using Combined Transformer and Convolutional Graph Neural Networks', Bioengineering, 11 (12), 1216, pp. 1–15. doi: 10.3390/bioengineering11121216.
Abstract: Attention is one of many human cognitive functions that are essential in everyday life. Given our limited processing capacity, attention helps us focus only on what matters. Focusing attention on one speaker in an environment with many speakers is a critical ability of the human auditory system. This paper proposes a new end-to-end method based on a combined transformer and graph convolutional neural network (TraGCNN) that can effectively detect auditory attention from electroencephalograms (EEGs). This approach eliminates the need for manual feature extraction, which is often time-consuming and subjective. Here, EEG signals are first converted to graphs. We then extract attention information from these graphs using spatial and temporal approaches. Finally, our models are trained with these data. Our model can detect auditory attention in both the spatial and temporal domains. The EEG input is first processed by transformer layers to obtain a sequential representation of the EEG based on attention onsets. Then, a family of graph convolutional layers is used to find the most active electrodes using the spatial positions of the electrodes. Finally, the corresponding EEG features of the active electrodes are fed into graph attention layers to detect auditory attention. The Fuglsang 2020 dataset is used in the experiments to train and test the proposed and baseline systems. Compared with state-of-the-art attention classification methods from the literature, the new TraGCNN approach yields the highest classification accuracy (80.12%). Additionally, the proposed model outperforms our previous graph-based model for different lengths of EEG segments. The new TraGCNN approach is advantageous because attention detection is achieved from the EEG signals of subjects without requiring speech stimuli, as is the case with conventional auditory attention detection methods. Furthermore, examining the proposed model for different lengths of EEG segments shows that the model is faster than our previous graph-based detection method in terms of computational complexity. The findings of this study have important implications for the understanding and assessment of auditory attention, which is crucial for many applications, such as brain–computer interface (BCI) systems, speech separation, and neuro-steered hearing aid development.
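To make the pipeline described in the abstract concrete, the following is a minimal sketch in PyTorch and PyTorch Geometric of a transformer → graph convolution → graph attention stack over EEG segments. The class name TraGCNNSketch, the layer sizes, the number of transformer layers, and the ring-shaped electrode graph are illustrative assumptions only; the paper's actual TraGCNN configuration and its position-based electrode graph construction may differ.

import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv, GATConv

class TraGCNNSketch(nn.Module):
    """Illustrative transformer -> GCN -> GAT pipeline for one EEG segment."""
    def __init__(self, n_electrodes=64, n_samples=128, d_model=64, n_classes=2):
        super().__init__()
        # Stage 1: transformer over the time axis; each token is one EEG
        # sample whose features are the readings of all electrodes.
        self.proj_in = nn.Linear(n_electrodes, d_model)
        enc = nn.TransformerEncoderLayer(d_model=d_model, nhead=4, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc, num_layers=2)
        self.proj_out = nn.Linear(d_model, n_electrodes)
        # Stage 2: graph convolution over the electrode graph; each node
        # (electrode) carries its temporally refined time course as features.
        self.gcn = GCNConv(n_samples, d_model)
        # Stage 3: graph attention weights the most informative electrodes.
        self.gat = GATConv(d_model, d_model, heads=1)
        self.classifier = nn.Linear(d_model, n_classes)

    def forward(self, eeg, edge_index):
        # eeg: (n_samples, n_electrodes); edge_index: (2, n_edges)
        h = self.transformer(self.proj_in(eeg).unsqueeze(0)).squeeze(0)
        x = self.proj_out(h).t()                    # (n_electrodes, n_samples)
        x = torch.relu(self.gcn(x, edge_index))     # spatial filtering
        x = torch.relu(self.gat(x, edge_index))     # electrode attention
        return self.classifier(x.mean(dim=0))       # pooled class logits

# Toy usage with an undirected ring-shaped electrode graph (purely
# illustrative; the paper derives adjacency from electrode positions):
n_el, n_t = 64, 128
ring = torch.arange(n_el)
edge_index = torch.stack([torch.cat([ring, (ring + 1) % n_el]),
                          torch.cat([(ring + 1) % n_el, ring])])
model = TraGCNNSketch(n_electrodes=n_el, n_samples=n_t)
logits = model(torch.randn(n_t, n_el), edge_index)  # shape: (2,)

The sketch classifies a single segment; batching, the dataset's binary attended-speaker labels, and training on the Fuglsang 2020 data would sit on top of this in the usual way.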
Description: Data Availability Statement: The original data presented in the study are openly available in [25]: Fuglsang, S.A.; Märcher-Rørsted, J.; Dau, T.; Hjortkjær, J. Effects of sensorineural hearing loss on cortical synchronization to competing speech during selective attention. J. Neurosci. 2020, 40, 2562–2572.
URI: https://bura.brunel.ac.uk/handle/2438/30386
DOI: https://doi.org/10.3390/bioengineering11121216
Other Identifiers: ORCiD: Masoud Geravanchizadeh https://orcid.org/0000-0003-3413-4966
ORCiD: Amir Shaygan Asl https://orcid.org/0009-0002-2672-0762
ORCiD: Sebelan Danishvar https://orcid.org/0000-0002-8258-0437
Appears in Collections: Dept of Civil and Environmental Engineering Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Size: 2.02 MB
Format: Adobe PDF


This item is licensed under a Creative Commons License.