Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/30251
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Alattal, D
dc.contributor.author: Azar, AK
dc.contributor.author: Myles, P
dc.contributor.author: Branson, R
dc.contributor.author: Abdulhussein, H
dc.contributor.author: Tucker, A
dc.date.accessioned: 2024-11-26T10:24:02Z
dc.date.available: 2024-11-26T10:24:02Z
dc.date.issued: 2025-05-10
dc.identifier: ORCiD: Allan Tucker, https://orcid.org/0000-0001-5105-3506
dc.identifier.citation: Alattal, D. et al. (2025) 'Integrating Explainable AI in Medical Devices: Technical, Clinical and Regulatory Insights and Recommendations', arXiv:2505.06620v1 [cs.HC], [preprint], pp. 1-47. doi: 10.48550/arXiv.2505.06620 (en_US)
dc.identifier.uri: https://bura.brunel.ac.uk/handle/2438/30251
dc.description: A preprint version of the article is available at arXiv:2505.06620v1 [cs.HC], https://arxiv.org/abs/2505.06620, [v1] Sat, 10 May 2025 12:09:19 UTC (1,260 KB), under a CC BY license. It has not been certified by peer review. (en_US)
dc.description: Availability of data and materials: the CPRD cardiovascular disease synthetic dataset used in this paper can be requested from CPRD (https://cprd.com/cprdcardiovascular-disease-synthetic-dataset)
dc.description.abstract: There is a growing demand for the use of Artificial Intelligence (AI) and Machine Learning (ML) in healthcare, particularly as clinical decision support systems to assist medical professionals. However, the complexity of many of these models, often referred to as black-box models, raises concerns about their safe integration into clinical settings, as it is difficult to understand how they arrive at their predictions. This paper discusses insights and recommendations derived from an expert working group convened by the UK Medicines and Healthcare products Regulatory Agency (MHRA). The group consisted of healthcare professionals, regulators, and data scientists, and focused primarily on evaluating the outputs of different AI algorithms in clinical decision-making contexts. The group also evaluated findings from a pilot study investigating clinicians' behaviour and interaction with AI methods during clinical diagnosis. Incorporating AI methods safely is crucial to the trustworthiness of medical AI devices in clinical settings. Adequate training for stakeholders is essential to address potential issues, and further insights and recommendations for safely adopting AI systems in healthcare settings are provided. (en_US)
dc.description.sponsorship: This work was funded by the Regulators Pioneer Fund 3, Department for Science, Innovation and Technology (DSIT). The RPF is a grant-based fund that enables UK regulators and local authorities to help create a UK regulatory environment that encourages business innovation and growth; the current £12m round is being delivered by DSIT. This work was also supported by the UK Regulatory Science and Innovation Networks - Implementation Phase: Human Health CERSIs programme, through the project RADIANT: Regulatory Science Empowering Innovation in Transformative Digital Health and AI (Grant Ref: MCPC24031), funded by the Medical Research Council (MRC) and Innovate UK. (en_US)
dc.format.extent: 1-47
dc.format.medium: Electronic
dc.language.iso: en (en_US)
dc.publisher: Cornell University (en_US)
dc.rights: Creative Commons Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: eXplainable AI (en_US)
dc.subject: CDSS (en_US)
dc.subject: medical devices (en_US)
dc.subject: AI regulation (en_US)
dc.title: Integrating Explainable AI in Medical Devices: Technical, Clinical and Regulatory Insights and Recommendations (en_US)
dc.type: Article (en_US)
dc.identifier.doi: https://doi.org/10.48550/arXiv.2505.06620
dc.relation.isPartOf: arXiv
pubs.volume: 0
dc.identifier.eissn: 2331-8422
dc.rights.license: https://creativecommons.org/licenses/by/4.0/legalcode.en
dc.rights.holder: The Author(s)
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File | Description | Size | Format
Preprint.pdf | Copyright © 2025 The Author(s). This work is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). | 785.78 kB | Adobe PDF


This item is licensed under a Creative Commons Attribution 4.0 International License.