Please use this identifier to cite or link to this item: http://bura.brunel.ac.uk/handle/2438/32906
Full metadata record
DC Field | Value | Language
dc.contributor.author | Xing, L | -
dc.contributor.author | Kaheh, Z | -
dc.date.accessioned | 2026-02-28T11:04:56Z | -
dc.date.available | 2026-02-28T11:04:56Z | -
dc.date.issued | 2026-02-24 | -
dc.identifier | ORCiD: Zohreh Kaheh https://orcid.org/0000-0002-8518-8545 | -
dc.identifier.citation | Xing, L. and Kaheh, Z. (2026) 'Fair Benchmarking in Short‐Term Load Forecasting', Artificial Intelligence for Engineering, 0 (ahead of print), pp. 1 - 14. doi: 10.1049/aie2.70011. | en-GB
dc.identifier.issn | 3067-249X | -
dc.identifier.uri | https://bura.brunel.ac.uk/handle/2438/32906 | -
dc.description | Data Availability Statement: The data that support the findings of this study are available from the corresponding author upon reasonable request. | en-GB
dc.description.abstract | Performance comparisons in short-term load forecasting are often confounded by differences in preprocessing pipelines rather than reflecting intrinsic architectural capability. Variations in feature engineering, scaling, temporal windowing and data partitioning can dominate reported accuracy and obscure the actual behaviour of forecasting models. This study examines preprocessing–architecture interaction by benchmarking random forest, LightGBM, long short-term memory (LSTM), transformer and Temporal Fusion Transformer (TFT) under a shared tabular preprocessing pipeline, ensuring strict control over data handling and evaluation conditions. Under this controlled setting, tree-based models exhibit strong predictive performance, whereas deep sequence models experience substantial degradation when temporal continuity is not explicitly represented. To isolate architectural sensitivity from preprocessing effects, we further conduct a within-architecture analysis by retraining an identical LSTM under a sequence-aware pipeline aligned with its temporal inductive bias. This realignment yields an order-of-magnitude reduction in RMSE, demonstrating that preprocessing design is a first-order determinant of deep sequence model performance. The results establish a transparent and reproducible benchmarking framework and highlight the importance of aligning data representation with model assumptions when interpreting comparative performance in time series forecasting. | en-GB
dc.description.sponsorship | The authors have nothing to report. | en-GB
dc.format.extent | 1 - 14 | -
dc.format.medium | Print-Electronic | -
dc.language | en-GB | -
dc.language.iso | en | en-GB
dc.publisher | Wiley on behalf of Institution of Engineering and Technology | en-GB
dc.rights | Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International | -
dc.rights.uri | https://creativecommons.org/licenses/by-nc-nd/4.0/ | -
dc.subject | deep learning | en-GB
dc.subject | electricity demand | en-GB
dc.subject | energy analytics | en-GB
dc.subject | forecasting | en-GB
dc.subject | machine learning benchmarking | en-GB
dc.subject | reproducibility | en-GB
dc.subject | temporal fusion transformer | en-GB
dc.title | Fair Benchmarking in Short‐Term Load Forecasting | en-GB
dc.type | Article | en-GB
dc.date.dateAccepted | 2026-02-08 | -
dc.identifier.doi | https://doi.org/10.1049/aie2.70011 | -
dc.relation.isPartOf | Artificial Intelligence for Engineering | -
pubs.issue | 0 | -
pubs.publication-status | Published online | -
pubs.volume | 00 | -
dc.identifier.eissn | 3067-2481 | -
dc.rights.license | https://creativecommons.org/licenses/by-nc-nd/4.0/legalcode.en | -
dcterms.dateAccepted | 2026-02-08 | -
dc.rights.holder | The Author(s) | -
dc.contributor.orcid | Kaheh, Zohreh [0000-0002-8518-8545] | -
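
The abstract's central claim — that the same series fed to tree models and to sequence models should be represented differently, and that misalignment with a model's temporal inductive bias degrades performance — can be illustrated with a minimal numpy sketch. The function names and shapes below are illustrative assumptions, not taken from the paper: one encoder flattens each time step into an independent row of lag features (the tabular view suited to random forest or LightGBM), the other preserves contiguous ordered windows (the sequence view an LSTM expects).

```python
import numpy as np

def tabular_features(series, lags):
    """Tabular view: each row holds the `lags` most recent values as
    independent columns; temporal order between rows is discarded."""
    # Column l (l = 1..lags) is the series shifted back by l steps.
    X = np.stack(
        [series[lags - l : len(series) - l] for l in range(1, lags + 1)],
        axis=1,
    )
    y = series[lags:]          # one-step-ahead target for each row
    return X, y

def sequence_windows(series, window):
    """Sequence view: each sample is a contiguous window in original
    order, keeping the temporal continuity a recurrent model relies on."""
    X = np.stack([series[i : i + window] for i in range(len(series) - window)])
    y = series[window:]
    return X[..., None], y     # shape (n, window, 1): one input feature per step
```

Both encoders expose the same information (the targets coincide), but the tabular rows list lags newest-first as unordered columns, while the windows retain the original chronology — the representational gap the study's within-architecture LSTM comparison isolates.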
Appears in Collections:Department of Mathematics Research Papers

Files in This Item:
File | Description | Size | Format
FullText.pdf | Copyright © 2026 The Author(s). Artificial Intelligence for Engineering published by John Wiley & Sons Ltd on behalf of Institution of Engineering and Technology. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/), which permits use and distribution in any medium, provided the original work is properly cited, the use is non-commercial and no modifications or adaptations are made. | 1.54 MB | Adobe PDF


This item is licensed under a Creative Commons License.