Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30106
Title: Specialized gray matter segmentation via a generative adversarial network: application on brain white matter hyperintensities classification
Authors: Bawil, MB
Shamsi, M
Bavil, AS
Danishvar, S
Keywords: gray matter segmentation;deep learning;conditional generative adversarial network;white matter hyperintensities;juxtacortical WMH;WMH classification;MRI images;multiple sclerosis
Issue Date: 30-Sep-2024
Publisher: Frontiers Media
Citation: Bawil, M.B. et al. (2024) 'Specialized gray matter segmentation via a generative adversarial network: application on brain white matter hyperintensities classification', Frontiers in Neuroscience, 18, 1416174, pp. 1 - 14. doi: 10.3389/fnins.2024.1416174.
Abstract: Background: White matter hyperintensities (WMH) observed in T2 fluid-attenuated inversion recovery (FLAIR) images have emerged as potential markers of neurodegenerative diseases such as multiple sclerosis (MS). Because current research lacks comprehensive automated WMH classification systems, accurate detection and classification methods for WMH are needed to support the diagnosis and monitoring of brain diseases. Objective: Juxtacortical WMH (JCWMH) is a less explored subtype of WMH, primarily because the cortex is difficult to delineate in FLAIR images, a difficulty escalated by the presence of lesions when obtaining appropriate gray matter (GM) masks. Methods: In this study, we present a specialized GM segmentation method developed for the classification of WMH, especially JCWMH. Using T1 and FLAIR images, we propose a pipeline that integrates masks of white matter, cerebrospinal fluid, ventricles, and WMH into a single mask used to refine the primary GM map. We then use this pipeline to generate paired data for training a conditional generative adversarial network (cGAN) that substitutes for the pipeline and reduces the required input to FLAIR images alone. WMH are then classified according to their distances to the ventricular and GM masks. Because multi-class labeled WMH datasets are scarce and deep learning models require extensive training data, we collected a large local dataset and manually segmented and labeled a subset for WMH and ventricles. Results: In JCWMH classification, the proposed method achieved a Dice similarity coefficient, precision, and sensitivity of 0.76, 0.69, and 0.84, respectively, clearly outperforming the approach commonly used in the literature, which uses GM masks extracted from T1 images registered to FLAIR and reached corresponding values of 0.66, 0.55, and 0.81. Conclusion: After training, the method proves its efficiency by providing results in less than one second, whereas the usual approach requires at least two minutes for registration and segmentation alone. The proposed method is automated and fast and requires no initialization, as it works exclusively with FLAIR images. Such innovative methods will facilitate accurate and meaningful analysis of WMH in clinical practice by reducing complexity and increasing efficiency.
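The abstract describes two operations that can be sketched in code: fusing tissue masks into a single exclusion mask that refines the primary GM map, and labeling WMH by their distance to the ventricular and GM masks. The sketch below is illustrative only; it assumes binary NumPy masks, and the helper names and the millimeter thresholds (`pv_mm`, `jc_mm`) are hypothetical placeholders, not the paper's actual rules or values:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def refine_gm(primary_gm, wm, csf, ventricles, wmh):
    """Refine a primary GM map by removing voxels claimed by other tissues.

    A simple set-difference sketch of the mask-integration step; the
    paper's actual refinement pipeline may differ.
    """
    exclusion = wm | csf | ventricles | wmh  # combined non-GM mask
    return primary_gm & ~exclusion

def classify_wmh(wmh, ventricles, gm, voxel_mm=1.0, pv_mm=3.0, jc_mm=3.0):
    """Label WMH voxels by anatomical distance.

    Returns 1 = periventricular, 2 = juxtacortical (JCWMH), 3 = deep WMH.
    Thresholds pv_mm and jc_mm are illustrative assumptions.
    """
    # Euclidean distance from every voxel to the nearest mask voxel, in mm
    dist_vent = distance_transform_edt(~ventricles) * voxel_mm
    dist_gm = distance_transform_edt(~gm) * voxel_mm

    labels = np.zeros(wmh.shape, dtype=np.uint8)            # 0 = background
    labels[wmh & (dist_vent <= pv_mm)] = 1                  # periventricular
    labels[wmh & (labels == 0) & (dist_gm <= jc_mm)] = 2    # juxtacortical
    labels[wmh & (labels == 0)] = 3                         # deep WMH
    return labels
```

With masks obtained from a cGAN-style GM segmentation of the FLAIR image, such a distance rule needs no registration step, which is consistent with the sub-second runtime reported above.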
Description: Data availability statement: The original contributions presented in the study are included in the article/supplementary material; further inquiries can be directed to the corresponding author.
URI: https://bura.brunel.ac.uk/handle/2438/30106
DOI: https://doi.org/10.3389/fnins.2024.1416174
ISSN: 1662-4548
Other Identifiers: ORCiD: Mahdi Bashiri Bawil, https://orcid.org/0009-0002-2029-3245
ORCiD: Mousa Shamsi, https://orcid.org/0000-0003-4670-0531
ORCiD: Abolhassan Shakeri Bavil, https://orcid.org/0000-0001-9397-0484
Appears in Collections:Dept of Civil and Environmental Engineering Research Papers

Files in This Item:
File: FullText.pdf
Description: Copyright © 2024 Bawil, Shamsi, Bavil and Danishvar. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
Size: 2.98 MB
Format: Adobe PDF


This item is licensed under a Creative Commons License.