Please use this identifier to cite or link to this item:
http://bura.brunel.ac.uk/handle/2438/31089
Title: | Leveraging BiLSTM-GAT for enhanced stock market prediction: a dual-graph approach to portfolio optimization |
Authors: | Lu, X; Poon, J; Khushi, M |
Keywords: | attention mechanism;BiLSTM;graph attention network;portfolio management;stock selection;technical indicators |
Issue Date: | 31-Mar-2025 |
Publisher: | Springer Nature |
Citation: | Lu, X., Poon, J. and Khushi, M. (2025) 'Leveraging BiLSTM-GAT for enhanced stock market prediction: a dual-graph approach to portfolio optimization', Applied Intelligence, 55 (7), 601, pp. 1 - 18. doi: 10.1007/s10489-025-06462-w. |
Abstract: | Stock price prediction remains a critical challenge in financial research due to its potential to inform strategic decision-making. Existing approaches predominantly focus on two key tasks: (1) regression, which forecasts future stock prices, and (2) classification, which identifies trading signals such as buy, sell, or hold. However, the inherent limitations of financial data hinder effective model training, often leading to suboptimal performance. To mitigate this issue, prior studies have expanded datasets by aggregating historical data from multiple companies. This strategy, however, fails to account for the unique characteristics and interdependencies among individual stocks, thereby reducing predictive accuracy. To address these limitations, we propose a novel BiLSTM-GAT-AM model that integrates bidirectional long short-term memory (BiLSTM) networks with graph attention networks (GAT) and an attention mechanism (AM). Unlike conventional graph-based models that define edges based solely on technical or fundamental relationships, our approach employs a dual-graph structure: one graph captures technical similarities, while the other encodes fundamental industry relationships. These two representations are aligned through an attention mechanism, enabling the model to exploit both technical and fundamental insights for enhanced stock market predictions. We conduct extensive experiments, including ablation studies and comparative evaluations against baseline models. The results demonstrate that our model achieves superior predictive performance. Furthermore, leveraging the model’s forecasts, we construct an optimized portfolio and conduct backtesting on the test dataset. Empirical results indicate that our portfolio consistently outperforms both baseline models and the S&P 500 index, highlighting the effectiveness of our approach in stock market prediction and portfolio optimization. |
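Note: | For readers who want a concrete picture of the dual-graph BiLSTM-GAT-AM architecture the abstract describes, the sketch below shows how such a model might be wired in plain PyTorch. It is not the authors' implementation: the layer sizes, the toy single-head graph-attention layer, the attention-based fusion of the two graph views, and all names are illustrative assumptions. |

```python
# Minimal illustrative sketch (assumptions throughout, not the paper's code):
# a BiLSTM encodes each stock's indicator window, two graph-attention branches
# propagate information over a technical-similarity graph and an industry graph,
# and a small attention mechanism fuses the two views before a prediction head.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGATLayer(nn.Module):
    """Toy single-head graph attention over a dense adjacency mask."""

    def __init__(self, dim):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.attn = nn.Linear(2 * dim, 1)

    def forward(self, h, adj):
        # h: (num_stocks, dim); adj: (num_stocks, num_stocks), 1 where an edge exists
        z = self.proj(h)
        n = z.size(0)
        pairs = torch.cat(
            [z.unsqueeze(1).expand(n, n, -1), z.unsqueeze(0).expand(n, n, -1)], dim=-1
        )
        scores = self.attn(pairs).squeeze(-1)        # raw pairwise attention logits
        scores = scores.masked_fill(adj == 0, -1e9)  # attend only along existing edges
        alpha = torch.softmax(scores, dim=-1)
        return F.elu(alpha @ z)


class BiLSTMGATAM(nn.Module):
    """BiLSTM temporal encoder + two GAT branches (technical / industry graphs),
    fused by an attention mechanism, followed by a regression head."""

    def __init__(self, num_features, hidden=32):
        super().__init__()
        self.bilstm = nn.LSTM(num_features, hidden, batch_first=True, bidirectional=True)
        self.gat_tech = SimpleGATLayer(2 * hidden)   # technical-similarity graph branch
        self.gat_ind = SimpleGATLayer(2 * hidden)    # industry-relationship graph branch
        self.fuse = nn.Linear(2 * hidden, 1)         # attention weights over the two views
        self.head = nn.Linear(2 * hidden, 1)         # next-period prediction per stock

    def forward(self, x, adj_tech, adj_ind):
        # x: (num_stocks, seq_len, num_features) of per-stock technical indicators
        h, _ = self.bilstm(x)
        h = h[:, -1, :]                              # last time step: (num_stocks, 2*hidden)
        views = torch.stack([self.gat_tech(h, adj_tech), self.gat_ind(h, adj_ind)], dim=1)
        w = torch.softmax(self.fuse(views), dim=1)   # (num_stocks, 2, 1) view weights
        fused = (w * views).sum(dim=1)
        return self.head(fused).squeeze(-1)


# Toy usage: 5 stocks, 30-day windows, 8 indicators per day, random graphs.
model = BiLSTMGATAM(num_features=8)
x = torch.randn(5, 30, 8)
adj_tech = (torch.rand(5, 5) > 0.5).float()
adj_ind = (torch.rand(5, 5) > 0.5).float()
print(model(x, adj_tech, adj_ind).shape)  # torch.Size([5])
```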
Description: | Data Availability: The data used in this research were obtained from a publicly available dataset from Yahoo Finance. |
URI: | https://bura.brunel.ac.uk/handle/2438/31089 |
DOI: | https://doi.org/10.1007/s10489-025-06462-w |
ISSN: | 0924-669X |
Other Identifiers: | ORCiD: Matloob Khushi https://orcid.org/0000-0001-7792-2327; Article number 601 |
Appears in Collections: | Dept of Computer Science Research Papers |
Files in This Item:
File | Description | Size | Format
---|---|---|---
FullText.pdf | Copyright © The Author(s) 2025. Rights and permissions: Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/. | 844.89 kB | Adobe PDF
This item is licensed under a Creative Commons License