A novel CBAMs-BiLSTM model for Chinese stock market forecasting
Cui Chenhao 1, Li Yong 1
Author information
- 1. School of Management, University of Science and Technology of China, Hefei, Anhui 230026, China
Abstract
The convolutional block attention module (CBAM) has demonstrated its superiority in various prediction problems, as it effectively enhances the prediction accuracy of deep learning models. However, there has been limited research testing the effectiveness of CBAM in predicting stock indexes. To fill this gap and improve the prediction accuracy of stock indexes, we propose a novel model called CBAMs-BiLSTM, which combines multiple CBAM modules with a bidirectional long short-term memory network (BiLSTM). In this study, we employ the standard metric evaluation method (SME) and the model confidence set test (MCS) to comprehensively evaluate the superiority and robustness of our model. We use two representative Chinese stock index data sets, namely the SSE Composite Index and the SZSE Composite Index, as our experimental data. The numerical results demonstrate that CBAMs-BiLSTM outperforms BiLSTM alone, achieving average reductions of 13.06%, 13.39%, and 12.48% in MAE, RMSE, and MAPE, respectively. These findings confirm that CBAM can effectively enhance the prediction accuracy of BiLSTM. Furthermore, we compare the proposed model with other popular models and examine the impact of changing the data set, the prediction method, and the size of the training set. The results consistently demonstrate the superiority and robustness of the proposed model in terms of prediction accuracy and investment returns.
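The combination described in the abstract can be sketched in PyTorch as a CBAM block (channel attention followed by spatial attention, as in the original CBAM design) placed ahead of a BiLSTM regressor. This is a minimal illustrative sketch only: the layer sizes, reduction ratio, kernel size, and the single-CBAM placement are assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class ChannelAttention(nn.Module):
    """CBAM channel attention: a shared MLP over avg- and max-pooled features."""
    def __init__(self, channels, reduction=2):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, max(channels // reduction, 1)),
            nn.ReLU(),
            nn.Linear(max(channels // reduction, 1), channels),
        )

    def forward(self, x):                      # x: (batch, channels, length)
        avg = self.mlp(x.mean(dim=2))          # global average pooling branch
        mx = self.mlp(x.amax(dim=2))           # global max pooling branch
        w = torch.sigmoid(avg + mx).unsqueeze(2)
        return x * w                           # rescale each channel

class SpatialAttention(nn.Module):
    """CBAM spatial attention: conv over stacked channel-wise avg/max maps."""
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                      # x: (batch, channels, length)
        avg = x.mean(dim=1, keepdim=True)
        mx = x.amax(dim=1, keepdim=True)
        w = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * w                           # rescale each time step

class CBAMsBiLSTM(nn.Module):
    """Hypothetical sketch: CBAM-refined features fed to a BiLSTM regressor."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.cbam = nn.Sequential(ChannelAttention(n_features), SpatialAttention())
        self.bilstm = nn.LSTM(n_features, hidden, batch_first=True,
                              bidirectional=True)
        self.head = nn.Linear(2 * hidden, 1)   # next-day index value

    def forward(self, x):                      # x: (batch, time, features)
        z = self.cbam(x.transpose(1, 2)).transpose(1, 2)
        out, _ = self.bilstm(z)
        return self.head(out[:, -1])           # predict from the last time step

model = CBAMsBiLSTM(n_features=5)
y = model(torch.randn(8, 30, 5))               # 8 windows of 30 days, 5 indicators
print(y.shape)                                 # torch.Size([8, 1])
```

In this sketch the channel attention reweights the input indicators (e.g. open, high, low, close, volume) while the spatial attention reweights time steps within the lookback window, before the BiLSTM summarizes the sequence in both directions.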
Key words
stock index prediction / BiLSTM / CBAM / MCS / SME
Publication year
2024