
Regformer: A Sparse-Attention-Based Method for Predicting Hydraulic Pressure Drop in Oil Pipelines

Hydraulic pressure drop prediction is essential for the production regulation of oil pipelines. Current machine learning methods treat pressure drop prediction as a regression problem; however, pipeline hydraulic calculations are affected by many factors, and the fixed weights that traditional machine learning methods learn from a training set generalize poorly to additional test samples or real engineering scenarios. This paper proposes a hydraulic pressure drop regression prediction method, Regformer, which introduces a sparse attention mechanism into the regression task, designs a smoothed probability method on top of multi-head attention, and incorporates a feature projection mechanism. In comparative experiments against seven mainstream methods on 10 public datasets, qualitative results show that Regformer fits local abrupt changes well. Experiments on hydraulic pressure drop prediction show that self-attention offers significant advantages for regression tasks with multivariate uncertainty; its handling of extreme cases in particular demonstrates the importance of adaptive regression parameters. Moreover, Regformer achieves better performance than Transformer with less computation, confirming the superiority of the proposed sparse attention and adaptive feature projection for the hydraulic pressure drop prediction task.
Regformer: A Hydraulic Pressure Drop Prediction Method for Oil Pipelines Based on Sparse Attention
Hydraulic pressure drop prediction is very important for the production regulation of oil pipelines. Current machine learning methods regard pressure drop prediction as a regression problem; however, pipeline hydraulic calculation is affected by many factors, and the fixed weights obtained from the training set by traditional machine learning methods are difficult to generalize to more test samples or real engineering scenarios. This paper proposes a hydraulic pressure drop regression prediction method, Regformer, which introduces a sparse attention mechanism into the regression task, designs a smoothing probability method based on multi-head attention, and incorporates a feature projection mechanism. In a comparative experimental analysis with seven mainstream methods on 10 public datasets, qualitative experiments show that Regformer has good fitting ability for local abrupt changes. Experiments on hydraulic pressure drop prediction show that the self-attention method has significant advantages for regression tasks with multivariate uncertainty; its handling of extreme cases especially reflects the importance of adaptive regression parameters. Regformer also achieves better performance than Transformer with less computation, verifying the superiority of the proposed sparse attention and adaptive feature projection for the hydraulic pressure drop prediction task.
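To make the core idea concrete, below is a minimal sketch of top-k sparse self-attention applied to feature tokens, as one common way to sparsify attention for a regression input. This is an illustration only: the function name, the top-k selection rule, and the omission of the paper's smoothing-probability step and feature projection are all assumptions of this sketch, not the authors' implementation.

```python
import numpy as np

def sparse_attention(x, Wq, Wk, Wv, k=4):
    """Top-k sparse self-attention over feature tokens.

    Illustrative stand-in for a sparse attention mechanism: each query
    attends only to its k highest-scoring keys; all other scores are
    masked out before the softmax, so attention weights are sparse.
    """
    q, key, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ key.T / np.sqrt(x.shape[-1])          # scaled dot-product
    # keep only the k largest scores per row; mask the rest to -inf
    thresh = np.sort(scores, axis=-1)[:, -k][:, None]
    scores = np.where(scores >= thresh, scores, -np.inf)
    # numerically stable softmax; masked entries become exactly 0
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ v

rng = np.random.default_rng(0)
d = 16
x = rng.standard_normal((8, d))                        # 8 feature tokens
Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
out = sparse_attention(x, Wq, Wk, Wv, k=4)
print(out.shape)  # (8, 16)
```

Restricting each query to its top-k keys reduces the effective cost of the attention step and, as the abstract argues, can help the model focus on the few inputs that matter for a given operating condition.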

hydrodynamic prediction; Transformer; Regformer; self-attention

李亚平、王军防、余红梅、窦一民、肖媛、田继林


PipeChina Eastern Crude Oil Storage and Transportation Co., Ltd., Xuzhou 221008, Jiangsu, China

College of Computer Science and Technology, China University of Petroleum (East China), Qingdao 266580, Shandong, China

hydraulic prediction; Transformer; Regformer; self-attention mechanism

Supported by: Major Program of the National Natural Science Foundation of China (51991365); Natural Science Foundation of Shandong Province (ZR2021MF082)

2024

Computer and Modernization (计算机与现代化)
Jiangxi Computer Society; Jiangxi Institute of Computing Technology


CSTPCD
Impact factor: 0.472
ISSN:1006-2475
Year, Volume (Issue): 2024 (1)