Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/32821
Full metadata record
DC Field: Value [Language]
dc.contributor.author: Huang, X
dc.contributor.author: Meng, H
dc.contributor.author: Li, Z
dc.date.accessioned: 2026-02-18T13:47:54Z
dc.date.available: 2026-02-18T13:47:54Z
dc.date.issued: 2026-02-12
dc.identifier.citation: Huang, X., Meng, H. and Li, Z. (2026) 'Pre-Ictal EEG Augmentation Based CDCGAN Model for Epileptic Seizure Prediction', Technologies, 14 (2), 114, pp. 1–15. doi: 10.3390/technologies14020114. [en_US]
dc.identifier.uri: http://bura.brunel.ac.uk/handle/2438/32821
dc.description: Data Availability Statement: The CHB-MIT Scalp EEG Database used in this study is publicly available at https://physionet.org/content/chbmit/1.0.0/ (accessed on 13 December 2025). [en_US]
dc.description.abstract: Epilepsy is a common neurological disorder affecting over 50 million people worldwide, characterised by recurrent seizures accompanied by abnormal neuronal electrical activity. Electroencephalography (EEG) is a technique for recording the brain's electrical signals, widely employed for epileptic seizure (ES) prediction due to its high temporal resolution, portability, and cost-effectiveness. However, reliable ES prediction based on EEG remains challenging, primarily owing to the limited duration of recorded pre-ictal states in publicly available datasets and the typically low signal-to-noise ratio (SNR) in non-invasive recordings. To mitigate these issues, we propose a Conditional Deep Convolutional Generative Adversarial Network (CDCGAN), which combines the representational power of a Deep Convolutional Generative Adversarial Network (DCGAN) with the categorical conditioning mechanism of a Conditional Generative Adversarial Network (CGAN) to generate class-specific EEG samples. By synthesising target samples, CDCGAN aims to alleviate class imbalance and enhance the quality of low-resolution spectral representations. To evaluate the practical utility of the generated data, we trained a Convolutional Neural Network (CNN) on the augmented dataset and compared its performance against prior studies. Under the Leave-One-Seizure-Out cross-validation (LOSO-CV) protocol, our method achieved an average AUC of 0.876 at a 60% augmentation rate with 50 training epochs. The AUC improvement relative to corresponding control settings demonstrates that GAN-based data augmentation provides additional effective training samples for ES prediction while preserving task-relevant and discriminative pre-ictal EEG features. [en_US]
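
For a concrete picture of the conditioning mechanism the abstract describes, below is a minimal PyTorch sketch of a class-conditional DCGAN generator: the class label is embedded and concatenated with the noise vector, and transposed convolutions upsample in DCGAN style. All sizes (latent dimension, embedding size, layer widths, the 32x32 single-channel output, and the two-class inter-ictal/pre-ictal setup) are illustrative assumptions for this sketch, not the configuration published in the paper.

    # Minimal sketch of a class-conditional DCGAN generator (PyTorch).
    # All sizes below are illustrative assumptions, not the paper's setup.
    import torch
    import torch.nn as nn

    class CDCGANGenerator(nn.Module):
        def __init__(self, latent_dim=100, n_classes=2, embed_dim=16):
            super().__init__()
            # CGAN-style conditioning: embed the class label so it can be
            # concatenated with the noise vector.
            self.label_embed = nn.Embedding(n_classes, embed_dim)
            self.net = nn.Sequential(
                # Project (noise + label) to a 4x4 map, then upsample with
                # transposed convolutions in DCGAN style.
                nn.ConvTranspose2d(latent_dim + embed_dim, 256, 4, 1, 0),
                nn.BatchNorm2d(256), nn.ReLU(True),
                nn.ConvTranspose2d(256, 128, 4, 2, 1),  # 4x4 -> 8x8
                nn.BatchNorm2d(128), nn.ReLU(True),
                nn.ConvTranspose2d(128, 64, 4, 2, 1),   # 8x8 -> 16x16
                nn.BatchNorm2d(64), nn.ReLU(True),
                nn.ConvTranspose2d(64, 1, 4, 2, 1),     # 16x16 -> 32x32
                nn.Tanh(),  # outputs scaled to [-1, 1]
            )

        def forward(self, z, labels):
            c = self.label_embed(labels)      # (batch, embed_dim)
            x = torch.cat([z, c], dim=1)      # condition the latent code
            return self.net(x.unsqueeze(-1).unsqueeze(-1))

    # Generate a batch conditioned on a hypothetical pre-ictal label (1).
    g = CDCGANGenerator()
    fake = g(torch.randn(8, 100), torch.ones(8, dtype=torch.long))
    print(fake.shape)  # torch.Size([8, 1, 32, 32])

Sampling with different label inputs is what would let a trained model of this kind synthesise class-specific (e.g., pre-ictal) spectrogram-like samples to rebalance a training set.
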
dc.description.sponsorship: This research was funded by the Royal Society (IEC\NSFC\223285) and the National Natural Science Foundation of China (General Program) No. 62171073. [en_US]
dc.format.extent: 1–15
dc.format.medium: Electronic
dc.language: English
dc.language.iso: en_US [en_US]
dc.publisher: MDPI [en_US]
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: epileptic seizure prediction [en_US]
dc.subject: electroencephalogram [en_US]
dc.subject: biomedical signal processing [en_US]
dc.subject: generative adversarial networks [en_US]
dc.subject: deep learning [en_US]
dc.title: Pre-Ictal EEG Augmentation Based CDCGAN Model for Epileptic Seizure Prediction [en_US]
dc.type: Article [en_US]
dc.identifier.doi: http://dx.doi.org/10.3390/technologies14020114
dc.relation.isPartOf: Technologies
pubs.issue: 2
pubs.publication-status: Published online
pubs.volume: 14
dc.identifier.eissn: 2227-7080
dc.rights.license: https://creativecommons.org/licenses/by/4.0/legalcode.en
dcterms.dateAccepted: 2026-02-09
dc.rights.holder: The authors
dc.contributor.orcid: Huang, Xindi [0009-0005-0580-3886]
dc.contributor.orcid: Meng, Hongying [0000-0002-8836-1382]
dc.contributor.orcid: Li, Zhangyong [0000-0002-3918-069X]
dc.identifier.number: 114
Appears in Collections: Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File: FullText.pdf (5.42 MB, Adobe PDF)
Description: Copyright © 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).


This item is licensed under a Creative Commons License.