Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/32298
Full metadata record
DC Field | Value | Language
dc.contributor.author | Ahmed, A | -
dc.coverage.spatial | London, UK | -
dc.date.accessioned | 2025-11-06T13:56:40Z | -
dc.date.available | 2025-11-06T13:56:40Z | -
dc.date.issued | 2025-10-01 | -
dc.identifier | Article number: 07008 | -
dc.identifier.citation | Ahmed, A. (2025) 'Hybrid and deep learning architectures for predictive maintenance: Evaluating LSTM, and attention-based LSTM-XGBoost on turbofan engine RUL', MATEC Web of Conferences, 413, 07008, pp. 1 - 8. doi: 10.1051/matecconf/202541307008. | en_US
dc.identifier.issn | 2274-7214 | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/32298 | -
dc.description.abstract | Accurate prediction of a machine's Remaining Useful Life (RUL) underpins modern, cost-effective predictive-maintenance programmes. This paper proposes a two-stage hybrid pipeline that couples sequence learning with tree-based residual modelling. In stage 1, 50-cycle windows of NASA C-MAPSS sensor data (FD001 and FD004 subsets) are processed by a bi-layer Long Short-Term Memory (LSTM) network equipped with an attention mechanism; attention weights highlight degradation-relevant time steps and yield a compact, interpretable context vector. In stage 2, this vector is concatenated with four statistical descriptors (mean, standard deviation, minimum, maximum) of each window and passed to an extreme gradient-boosted decision-tree regressor (XGBoost) tuned via grid search. Identical preprocessing and early-stopping schedules are applied to a baseline LSTM for fair comparison. The attention-LSTM–XGBoost model lowers Mean Absolute Error (MAE) by 9.8 % on FD001 and 7.4 % on the more challenging FD004, and reduces Root Mean Squared Error (RMSE) by 8.1 % and 5.6 %, respectively, relative to the baseline. Gains on FD004 demonstrate robustness to multiple fault modes and six operating regimes. By combining temporal attention with gradient-boosted residual fitting, the proposed architecture delivers state-of-the-art accuracy while retaining feature-level interpretability, an asset for safety-critical maintenance planning. | en_US
dc.format.extent | 1 - 8 | -
dc.format.medium | Print-Electronic | -
dc.language.iso | en | en_US
dc.publisher | EDP Sciences | en_US
dc.rights | Creative Commons Attribution 4.0 International | -
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | -
dc.source | International Conference on Measurement, AI, Quality and Sustainability (MAIQS 2025) | -
dc.title | Hybrid and deep learning architectures for predictive maintenance: Evaluating LSTM, and attention-based LSTM-XGBoost on turbofan engine RUL | en_US
dc.type | Conference Paper | en_US
dc.date.dateAccepted | 2025-06-08 | -
pubs.finish-date | 2025-08-28 | -
pubs.start-date | 2025-08-26 | -
pubs.volume | 413 | -
dc.identifier.eissn | 2261-236X | -
dc.rights.license | https://creativecommons.org/licenses/by/4.0/legalcode.en | -
dcterms.dateAccepted | 2025-06-08 | -
dc.rights.holder | The Authors | -
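For readers who want a concrete picture of the two-stage pipeline summarised in the abstract above, the following is a minimal, hypothetical Python sketch, not the authors' code. It assumes PyTorch for the attention-LSTM feature extractor and the xgboost package for the stage-2 regressor; the 14 sensor channels, hidden width, pooled window statistics, RUL cap, and all hyperparameters are illustrative assumptions, and stage-1 training and the grid search are omitted.

# Hypothetical sketch of the two-stage pipeline described in the abstract.
# Stage 1: a bi-layer LSTM with additive attention turns each 50-cycle window
# into a context vector. Stage 2: that vector is concatenated with four window
# statistics (mean, std, min, max) and regressed onto RUL with XGBoost.
# Dimensions and hyperparameters below are assumptions, not the paper's values.
import numpy as np
import torch
import torch.nn as nn
import xgboost as xgb

class AttentionLSTM(nn.Module):
    def __init__(self, n_sensors=14, hidden=64):
        super().__init__()
        # Two stacked LSTM layers over the sensor sequence.
        self.lstm = nn.LSTM(n_sensors, hidden, num_layers=2, batch_first=True)
        self.score = nn.Linear(hidden, 1)  # attention scorer over time steps

    def forward(self, x):                            # x: (batch, 50, n_sensors)
        h, _ = self.lstm(x)                          # (batch, 50, hidden)
        w = torch.softmax(self.score(h), dim=1)      # attention weights over time
        context = (w * h).sum(dim=1)                 # (batch, hidden) context vector
        return context, w

def window_stats(x):
    """Mean, std, min, max per window, pooled over time steps and sensors here;
    the paper may compute them per sensor instead."""
    return np.stack([x.mean(axis=(1, 2)), x.std(axis=(1, 2)),
                     x.min(axis=(1, 2)), x.max(axis=(1, 2))], axis=1)

# Random data standing in for preprocessed C-MAPSS windows and RUL labels.
X = np.random.randn(256, 50, 14).astype(np.float32)  # 256 windows, 50 cycles, 14 sensors
y = np.random.rand(256) * 125                         # placeholder capped RUL targets

model = AttentionLSTM()
with torch.no_grad():                                 # stage-1 features (training omitted)
    context, _ = model(torch.from_numpy(X))

features = np.hstack([context.numpy(), window_stats(X)])
reg = xgb.XGBRegressor(n_estimators=200, max_depth=4) # stage-2 regressor (grid search omitted)
reg.fit(features, y)
pred = reg.predict(features)

In this sketch the XGBoost stage regresses RUL directly from the concatenated features; the abstract's "tree-based residual modelling" suggests the published pipeline may instead fit the residual of the LSTM prediction, which would only change the target passed to reg.fit.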
Appears in Collections: Mechanical and Aerospace Engineering
Dept of Mechanical and Aerospace Engineering Research Papers

Files in This Item:
File | Description | Size | Format
FiullText.pdf | Copyright © The Authors, published by EDP Sciences, 2025. Licence: Creative Commons. This is an Open Access article distributed under the terms of the Creative Commons Attribution License 4.0 (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. | 569.39 kB | Adobe PDF


This item is licensed under a Creative Commons Attribution 4.0 International License.