A Study on the Application of the A2former Model in Time Series Forecasting
HU Qianwei 1, WANG Xiuqing 2, AN Yang 1, ZHANG Nuofei 1, WANG Guangchao 1
Author Information
- 1. College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang 050024, China; Hebei Provincial Key Laboratory of Network and Information Security, Hebei Normal University, Shijiazhuang 050024, China
- 2. College of Computer and Cyber Security, Hebei Normal University, Shijiazhuang 050024, China; Hebei Provincial Key Laboratory of Network and Information Security, Hebei Normal University, Shijiazhuang 050024, China; Hebei Provincial Engineering Research Center for Supply Chain Big Data Analytics and Data Security, Hebei Normal University, Shijiazhuang 050024, China
Abstract
Time series forecasting plays an important role in fields such as finance, medicine, transportation, and meteorology. In long sequence time-series forecasting (LSTF), it is urgent to improve forecast accuracy and to solve problems such as insufficient memory. In recent years, the successful application of the Transformer in natural language processing has also attracted considerable attention in forecasting research, and the Informer model, a Transformer variant, has made great progress in time series forecasting. In this paper, we propose the A2former model, which combines the Informer framework with an additive attention mechanism. The A2former model was evaluated on the ETT, WTH, ECL, and PM2.5 datasets for LSTF. Experimental results show that A2former outperforms existing baseline methods (e.g., LSTMa and Informer) in LSTF. A2former not only reduces the time complexity of attention to linear and improves forecast accuracy, but also enables more efficient sequence modeling. Our work provides a valuable reference for time series forecasting.
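The abstract states that the additive attention mechanism brings the cost of attention down to linear in the sequence length, but does not give A2former's exact formulation here. As a rough illustration only, the sketch below shows a generic additive attention in the Fastformer style: instead of the O(N²) pairwise query-key matrix, each step condenses the sequence into a single global vector via a learned scoring vector, so the total cost is O(N·d). All names (`additive_attention`, `w_q`, `w_k`) are hypothetical and are not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def additive_attention(Q, K, V, w_q, w_k):
    """Additive attention sketch with O(N*d) cost (Fastformer-style).

    Q, K, V: (N, d) projected inputs; w_q, w_k: (d,) learned scoring vectors.
    A global query/key vector replaces the O(N^2) pairwise score matrix.
    """
    d = Q.shape[1]
    alpha = softmax(Q @ w_q / np.sqrt(d))   # (N,) scores over positions
    q_star = alpha @ Q                      # (d,) global query vector
    P = K * q_star                          # (N, d) query-key interaction
    beta = softmax(P @ w_k / np.sqrt(d))    # (N,) scores over positions
    k_star = beta @ P                       # (d,) global key vector
    return V * k_star                       # (N, d) key-value interaction

# Toy usage: a length-96 window (a typical LSTF horizon) with d = 8.
rng = np.random.default_rng(0)
N, d = 96, 8
Q, K, V = (rng.standard_normal((N, d)) for _ in range(3))
out = additive_attention(Q, K, V, rng.standard_normal(d), rng.standard_normal(d))
print(out.shape)  # (96, 8)
```

Every step is a single pass over the N positions (two weighted sums and two element-wise products), which is what makes the complexity linear rather than quadratic in the input length.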
Keywords
time series forecasting / additive attention mechanism / Transformer model / Informer model / deep learning
Funding
National Natural Science Foundation of China (61673160)
Natural Science Foundation of Hebei Province (F2018205102)
Key Science and Technology Research Project of Higher Education Institutions of Hebei Province (ZD2021063)
Publication Year
2024