Power load forecasting based on improved Transformer
Qin Xiwen 1, Tang Yingjie 1, Dong Xiaogang 1, Zhu Yanfei 2
Author information
- 1. Institute of Big Data Science, Changchun University of Technology, Changchun 130012, Jilin, China; School of Mathematics and Statistics, Changchun University of Technology, Changchun 130012, Jilin, China
- 2. School of Politics and Law, Northeast Normal University, Changchun 130024, Jilin, China
Abstract
This paper proposes an improved Transformer model for the power load forecasting task. A fully connected layer replaces the original decoder structure, which reduces model complexity while making the model better suited to power load data, and the AdamW method is used to correct the flawed weight-decay handling common in deep learning optimizers. Experimental results on real power load datasets from Los Angeles, New York, and Sacramento show that the improved Transformer model forecasts power load more accurately than ELM, RNN, LSTM, and the traditional Transformer model.
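The architecture described above — a standard Transformer encoder whose decoder is replaced by a fully connected output layer, trained with AdamW's decoupled weight decay — can be sketched as follows. This is a minimal illustration, not the authors' implementation: all layer sizes, the forecast horizon, and the learning-rate and weight-decay values are assumptions, and positional encoding (mentioned in the keywords) is omitted for brevity.

```python
import torch
import torch.nn as nn

class LoadForecaster(nn.Module):
    """Sketch: Transformer encoder + fully connected head in place of a decoder.
    All hyperparameters here are illustrative assumptions, not the paper's values."""

    def __init__(self, n_features=1, d_model=64, n_heads=4, n_layers=2, horizon=24):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)          # project load series into model dimension
        enc_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, n_layers)
        self.head = nn.Linear(d_model, horizon)              # fully connected layer replaces the decoder

    def forward(self, x):
        # x: (batch, seq_len, n_features) — a window of past load readings
        h = self.encoder(self.embed(x))
        return self.head(h[:, -1])                           # forecast `horizon` future loads from the last step

model = LoadForecaster()
# AdamW applies weight decay decoupled from the gradient update,
# fixing the flaw in L2-regularized Adam that the abstract refers to.
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3, weight_decay=1e-2)

y = model(torch.randn(8, 96, 1))  # 8 series, 96 past steps, 1 feature
print(tuple(y.shape))             # (8, 24): 24-step forecast per series
```

A training loop would then minimize a regression loss (e.g. MSE) between `y` and the true future loads; the fully connected head keeps the model a single forward pass, with no autoregressive decoding.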
Key words
Transformer / self-attention mechanism / power load forecasting / position encoding
Publication year
2024