Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/19498
Title: Combining LSTM and DenseNet for Automatic Annotation and Classification of Chest X-Ray Images
Authors: Yan, F
Huang, X
Yao, Y
Lu, M
Li, M
Keywords: Annotation; deep neural network; DenseNet; long short-term memory
Issue Date: 3-Jun-2019
Publisher: Institute of Electrical and Electronics Engineers
Citation: IEEE Access, 2019, 7, pp. 74181 - 74189
Abstract: The chest X-ray is a simple and economical medical aid for auxiliary diagnosis and therefore has become a routine item for residents' physical examinations. Based on 40,167 images of chest radiographs and corresponding reports, we explore the abnormality classification problem of chest X-rays by taking advantage of deep learning techniques. First of all, since the radiology reports are generally templatized by the aberrant physical regions, we propose an annotation method according to the abnormal part in the images. Second, building on a small number of reports that are manually annotated by professional radiologists, we employ the long short-term memory (LSTM) model to automatically annotate the remaining unlabeled data. The result shows that the precision value reaches 0.88 in accurately annotating images, the recall value reaches 0.85, and the F1-score reaches 0.86. Finally, we classify the abnormality in the chest X-rays by training convolutional neural networks, and the results show that the average AUC value reaches 0.835.
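
The abstract describes a two-stage pipeline: an LSTM that assigns abnormality labels to the report text, followed by a convolutional network (DenseNet, per the title) that classifies the X-ray images. The PyTorch sketch below is a minimal illustration of that pipeline, not the authors' implementation; the vocabulary size, hidden sizes, the 14-label set, the 224x224 input resolution, and the use of torchvision's DenseNet-121 are all assumptions made for the example.

```python
# Minimal sketch (hypothetical hyper-parameters) of the two-stage pipeline:
# stage 1 labels radiology reports with an LSTM, stage 2 classifies images
# with a DenseNet-121 whose classifier head is replaced for multi-label output.
import torch
import torch.nn as nn
from torchvision import models


class ReportAnnotator(nn.Module):
    """LSTM text classifier that propagates labels from the manually
    annotated reports to the remaining unlabeled reports (stage 1)."""

    def __init__(self, vocab_size, num_labels, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_labels)

    def forward(self, token_ids):                 # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        _, (h_n, _) = self.lstm(x)                # last hidden state summarizes the report
        return self.fc(h_n[-1])                   # multi-label logits


def build_image_classifier(num_labels):
    """DenseNet-121 adapted for multi-label chest X-ray classification (stage 2).
    The paper may use different weights or input preprocessing."""
    model = models.densenet121()
    model.classifier = nn.Linear(model.classifier.in_features, num_labels)
    return model


if __name__ == "__main__":
    annotator = ReportAnnotator(vocab_size=10000, num_labels=14)
    report_logits = annotator(torch.randint(1, 10000, (2, 60)))           # two dummy reports
    classifier = build_image_classifier(num_labels=14)
    image_scores = torch.sigmoid(classifier(torch.randn(2, 3, 224, 224))) # two dummy images
    print(report_logits.shape, image_scores.shape)
```

Both stages would typically be trained with a multi-label loss such as binary cross-entropy, with the LSTM's predicted labels serving as training targets for the image classifier.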
URI: http://bura.brunel.ac.uk/handle/2438/19498
DOI: http://dx.doi.org/10.1109/ACCESS.2019.2920397
ISSN: 2169-3536
Appears in Collections: Publications

Files in This Item:
File: Fulltext.pdf    Size: 6.58 MB    Format: Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.