
Power Load Forecasting Based on an Improved Transformer

This paper proposes an improved Transformer model for the power load forecasting task. A fully connected layer replaces the original decoder structure, which reduces model complexity while making the model fit power load data better, and the AdamW method is used to correct the flawed weight-decay handling that is common in deep learning optimizers. Experimental results on real power load datasets from Los Angeles, New York, and Sacramento show that the improved Transformer model predicts power loads more accurately than ELM, RNN, LSTM, and the traditional Transformer model.
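The abstract's key architectural change, replacing the Transformer decoder with a fully connected output layer, can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and names (single attention head, no positional encoding or layer normalization), not the paper's actual implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # X: (seq_len, d_model) window of embedded past load readings
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])      # scaled dot-product attention
    return softmax(scores) @ V                   # (seq_len, d_model)

def forecast(X, Wq, Wk, Wv, W_fc, b_fc):
    H = self_attention(X, Wq, Wk, Wv)
    # Fully connected head instead of a decoder stack: flatten the encoded
    # sequence and project directly to the next-step load value.
    return H.reshape(-1) @ W_fc + b_fc           # scalar prediction

rng = np.random.default_rng(0)
seq_len, d_model = 24, 8                         # e.g. 24 hourly readings (assumed)
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
W_fc = rng.normal(size=(seq_len * d_model,))
y_hat = forecast(X, Wq, Wk, Wv, W_fc, 0.0)
print(float(y_hat))
```

Replacing the autoregressive decoder with a single projection also means the model emits its forecast in one pass over the input window, which is one way the complexity reduction described in the abstract can materialize.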

Keywords: Transformer; self-attention mechanism; power load forecasting; position encoding
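The AdamW correction mentioned in the abstract decouples weight decay from the adaptive gradient update: plain Adam folds L2 decay into the gradient, where it gets rescaled by the per-parameter step size, while AdamW decays the weights directly. A minimal single-parameter-vector sketch, with all hyperparameters assumed:

```python
import numpy as np

def adamw_step(w, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999,
               eps=1e-8, weight_decay=1e-2):
    m = b1 * m + (1 - b1) * grad                 # first-moment estimate
    v = b2 * v + (1 - b2) * grad**2              # second-moment estimate
    m_hat = m / (1 - b1**t)                      # bias correction
    v_hat = v / (1 - b2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)  # Adam update (no decay here)
    w = w - lr * weight_decay * w                # decoupled weight decay
    return w, m, v

w = np.array([1.0, -2.0])
m = v = np.zeros_like(w)
for t in range(1, 4):                            # a few dummy steps
    grad = 2 * w                                 # gradient of ||w||^2
    w, m, v = adamw_step(w, grad, m, v, t)
print(w)
```

Because the decay term never passes through the `sqrt(v_hat)` denominator, its strength is uniform across parameters, which is the flaw in standard weight-decay handling that the abstract refers to.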

Qin Xiwen, Tang Yingjie, Dong Xiaogang, Zhu Yanfei


Institute of Big Data Science, Changchun University of Technology, Changchun 130012, Jilin, China

School of Mathematics and Statistics, Changchun University of Technology, Changchun 130012, Jilin, China

School of Politics and Law, Northeast Normal University, Changchun 130024, Jilin, China



Journal of Changchun University of Technology
Changchun University of Technology

Impact factor: 0.282
ISSN: 1674-1374
Year, Volume (Issue): 2024, 45(5)