
Research on Electricity Consumption Prediction Based on Transformer and Improved Memory Mechanism

In recent years, the rapid development of China's economy has placed higher demands on power allocation, and allocating power resources efficiently requires more accurate electricity consumption forecasting. Advances in artificial intelligence, machine learning, and related technologies have made efficient and accurate forecasting feasible. Long Short-Term Memory (LSTM) networks and their variants are currently the models most widely used in this field, but their accuracy is relatively low. This paper proposes an electricity consumption prediction model based on a Transformer and an improved memory mechanism: a Transformer encodes the input, and a novel memory mechanism performs the prediction. Experiments show that, compared with random forest regression and with LSTM and its variants, the proposed method reduces the average error over one week by 9.05% and 5.32% respectively, converges faster, and generalizes better.
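The abstract describes the architecture only at a high level: a Transformer encodes the input window and a memory mechanism produces the prediction. Since the paper's actual memory design is not given here, the PyTorch sketch below is only one plausible reading of that pipeline; the memory module (a learnable slot bank read by attention), the class and parameter names, and all layer sizes are illustrative assumptions, not the authors' method.

import torch
import torch.nn as nn

class MemoryAugmentedForecaster(nn.Module):
    """Hypothetical sketch: Transformer encoder + learnable memory bank."""
    def __init__(self, window=168, d_model=64, n_heads=4, n_layers=2, n_slots=16):
        super().__init__()
        self.input_proj = nn.Linear(1, d_model)               # scalar load -> d_model
        self.pos_emb = nn.Parameter(torch.randn(window, d_model) * 0.02)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # Assumed memory mechanism: n_slots learnable vectors read by attention;
        # the paper's "improved memory mechanism" is not specified in the abstract.
        self.memory = nn.Parameter(torch.randn(n_slots, d_model) * 0.02)
        self.read = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, 1)                     # next-step forecast

    def forward(self, x):
        # x: (batch, window) past consumption readings.
        h = self.input_proj(x.unsqueeze(-1)) + self.pos_emb   # (B, W, D)
        h = self.encoder(h)
        mem = self.memory.expand(x.size(0), -1, -1)           # (B, S, D)
        read, _ = self.read(h, mem, mem)                      # read from memory
        summary = (h + read).mean(dim=1)                      # pool over time
        return self.head(summary).squeeze(-1)                 # (B,)

model = MemoryAugmentedForecaster()
past_week = torch.randn(8, 168)      # 8 series, one week of hourly readings
print(model(past_week).shape)        # torch.Size([8])

Trained with a regression loss such as MSE, a model of this shape could then be compared against the random forest and LSTM baselines named in the abstract.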

memory network; Transformer; time series prediction; machine learning; Long Short-Term Memory

Cai Yue, Zhang Jinming, Guo Jing, Xu Yuhua, Sun Zhixin


School of Modern Posts, Nanjing University of Posts and Telecommunications; Jiangsu Engineering Research Center of Post Big Data Technology and Application; Post Industry Technology R&D Center of the State Post Bureau (Internet of Things Technology); Key Laboratory of Broadband Wireless Communication and Sensor Network Technology, Ministry of Education, Nanjing 210000, China

State Grid Information & Telecommunication Group Co., Ltd., Beijing 102211, China

Sichuan Aostar Information Technologies Co., Ltd., Chengdu 610041, China


National Natural Science Foundation of China

61972208

2024

Information Technology
Sponsors: Heilongjiang Provincial Society of Information Technology; China Center for Information Industry Development; Electronic Information Center, Ministry of Information Industry of China


CSTPCD
Impact factor: 0.413
ISSN: 1009-2552
Year, Volume (Issue): 2024, (6)