

Transformer Fault Diagnosis Method Based on Self-attention Mechanism and 1D-CNN
Objective Transformers are important equipment in power systems. Effective identification of the fault category when a transformer fails improves the efficiency of power maintenance, which is of great significance for the safe operation of the power grid. To address the insufficient accuracy of transformer fault identification in power grid maintenance, this paper proposes a transformer fault diagnosis method based on a self-attention mechanism and a 1D-CNN. Conventional convolution tends to lose feature information when processing dissolved gas analysis (DGA) samples, resulting in low fault diagnosis accuracy. By combining the self-attention mechanism with a 1D-CNN, the proposed method effectively mitigates this problem and improves the accuracy and reliability of transformer fault diagnosis.

Methods To reduce the loss of extracted feature information as it propagates between model layers, the ReLU activation function in the original 1D-CNN is replaced with LeakyReLU. Unlike ReLU, under which many neurons are never activated, LeakyReLU reduces model sparsity and increases the diversity of network feature information. The self-attention mechanism weights the feature information of the dissolved gas data in transformer oil, effectively enhancing the informative features. A dynamic decay learning rate strategy is used to tune the optimizer.

Results The proposed method reduces the loss to 0.078, a decrease of 44.7% and 38.6% compared with the variants without the dynamic decay learning rate and with ReLU activation, respectively. The diagnostic accuracy reaches 93.79%, an improvement of 0.36% and 2.12% over the 1D-CNN and GOA-BP methods, respectively.

Conclusion Case simulations validate the effectiveness and superiority of the proposed method, demonstrating that the transformer fault diagnosis method based on the self-attention mechanism and 1D-CNN can effectively improve diagnostic accuracy and reduce model loss.
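The pipeline described in the abstract (1-D convolution with LeakyReLU activation, self-attention weighting of the gas features, and a dynamically decaying learning rate) can be sketched in plain NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the layer sizes, the 5-gas input layout, and the decay schedule (`decay_rate`, `decay_steps`) are hypothetical choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_relu(x, alpha=0.01):
    # LeakyReLU keeps a small gradient for negative inputs, so fewer
    # neurons go permanently "dead" than with plain ReLU
    return np.where(x > 0, x, alpha * x)

def conv1d(x, kernels, bias):
    # x: (length, in_ch); kernels: (out_ch, k, in_ch); 'valid' 1-D convolution
    k = kernels.shape[1]
    out = np.empty((x.shape[0] - k + 1, kernels.shape[0]))
    for i in range(out.shape[0]):
        window = x[i:i + k]                               # (k, in_ch)
        out[i] = np.tensordot(kernels, window, axes=([1, 2], [0, 1])) + bias
    return out

def self_attention(x, Wq, Wk, Wv):
    # scaled dot-product self-attention: re-weights positions so that
    # informative gas features contribute more to the representation
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(q.shape[1])
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)                     # row-wise softmax
    return w @ v

def decayed_lr(lr0, step, decay_rate=0.95, decay_steps=100):
    # smooth exponential decay of the learning rate over training steps
    return lr0 * decay_rate ** (step / decay_steps)

# hypothetical DGA sample: 5 gas concentrations (H2, CH4, C2H6, C2H4, C2H2)
# treated as a length-5, single-channel sequence
x = rng.random((5, 1))
feat = leaky_relu(conv1d(x, rng.standard_normal((4, 3, 1)), np.zeros(4)))
out = self_attention(feat, *(rng.standard_normal((4, 4)) for _ in range(3)))
print(out.shape, round(decayed_lr(0.01, 200), 6))         # (3, 4) 0.009025
```

In a full diagnosis model, the attended features would feed a classifier head over the fault categories, and `decayed_lr` would update the optimizer's step size each epoch.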

transformer; fault diagnosis; self-attention mechanism; convolutional neural network

刘国柱


School of Electrical and Information Engineering, Anhui University of Science and Technology, Huainan 232001, Anhui, China


2025

Journal of Chongqing Technology and Business University (Natural Science Edition)
Chongqing Technology and Business University


Impact factor: 0.548
ISSN:1672-058X
Year, Volume (Issue): 2025, 42(1)