Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/19498
Full metadata record
DC Field | Value | Language
dc.contributor.author | Yan, F | -
dc.contributor.author | Huang, X | -
dc.contributor.author | Yao, Y | -
dc.contributor.author | Lu, M | -
dc.contributor.author | Li, M | -
dc.date.accessioned | 2019-11-05T16:29:27Z | -
dc.date.available | 2019-11-05T16:29:27Z | -
dc.date.issued | 2019-06-03 | -
dc.identifier | ORCiD: Xin Huang https://orcid.org/0000-0002-5470-1203 | -
dc.identifier | ORCiD: Mingming Lu https://orcid.org/0000-0002-4762-1280 | -
dc.identifier | ORCiD: Maozhen Li https://orcid.org/0000-0002-0820-5487 | -
dc.identifier.citation | Yan, F. et al. (2019) 'Combining LSTM and DenseNet for Automatic Annotation and Classification of Chest X-Ray Images', IEEE Access, 7, pp. 74181-74189. doi: 10.1109/ACCESS.2019.2920397 | en_US
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/19499 | -
dc.description.abstract | The chest X-ray is a simple and economical medical aid for auxiliary diagnosis and therefore has become a routine item in residents' physical examinations. Based on 40167 images of chest radiographs and corresponding reports, we explore the abnormality classification problem of chest X-rays by taking advantage of deep learning techniques. First, since the radiology reports are generally templatized by the aberrant physical regions, we propose an annotation method according to the abnormal part in the images. Second, building on a small number of reports that are manually annotated by professional radiologists, we employ the long short-term memory (LSTM) model to automatically annotate the remaining unlabeled data. The results show that in accurately annotating images the precision value reaches 0.88, the recall value reaches 0.85, and the F1-score reaches 0.86. Finally, we classify the abnormality in the chest X-rays by training convolutional neural networks, and the results show that the average AUC value reaches 0.835. | en_US
dc.description.sponsorship | Science and Technology Commission of Shanghai Municipality (10.13039/501100003399; Grant Number: 16511102800); Fundamental Research Funds for the Central Universities (Grant Number: 22120180117) | -
dc.format.extent | 74181-74189 | -
dc.format.medium | Electronic | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) | en_US
dc.rights | Copyright © 2019 IEEE. Open Access. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications_standards/publications/rights/index.html for more information. | -
dc.rights.uri | https://www.ieee.org/publications_standards/publications/rights/index.html | -
dc.subject | annotation | en_US
dc.subject | deep neural network | en_US
dc.subject | DenseNet | en_US
dc.subject | long short-term memory | en_US
dc.title | Combining LSTM and DenseNet for Automatic Annotation and Classification of Chest X-Ray Images | en_US
dc.type | Article | en_US
dc.date.dateAccepted | 2019-05-26 | -
dc.identifier.doi | https://doi.org/10.1109/ACCESS.2019.2920397 | -
dc.relation.isPartOf | IEEE Access | -
pubs.publication-status | Published | -
pubs.volume | 7 | -
dc.identifier.eissn | 2169-3536 | -
dc.rights.holder | IEEE | -
Appears in Collections: Publications
Dept of Electronic and Electrical Engineering Research Papers

Files in This Item:
File | Description | Size | Format
Fulltext.pdf | Copyright © 2019 IEEE. Open Access. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission. See https://www.ieee.org/publications_standards/publications/rights/index.html for more information. | 6.58 MB | Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.