Two-stage Precipitation Nowcasting Network Based on Halo Attention Mechanism

Previous deep learning methods for precipitation nowcasting try to model the spatiotemporal evolution of radar echoes in a unified architecture. However, these methods may fail to fully capture the complex spatiotemporal relationships. This study proposes a two-stage precipitation nowcasting network based on the Halo attention mechanism, which divides the spatiotemporal evolution of precipitation prediction into two stages: motion trend prediction and spatial appearance reconstruction. First, a learnable optical flow module models the motion trend of the radar echoes and generates coarse prediction results. Second, a feature reconstruction module models the spatial appearance changes in the historical radar echo sequence and refines the spatial appearance of the coarse-grained predictions, generating fine-grained radar echo maps. Experiments on the CIKM dataset show that, compared with mainstream methods, the proposed method improves the average Heidke skill score and critical success index by 4.60% and 3.63%, reaching 0.48 and 0.45, respectively; improves the structural similarity index by 4.84%, reaching 0.52; and reduces the mean squared error by 6.13%, reaching 70.23.
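To make the two-stage design concrete, the following is a minimal PyTorch-style sketch of the coarse-to-fine structure described in the abstract: a flow module predicts a dense motion field from the input sequence, the last observed echo frame is warped to form the coarse prediction, and a reconstruction module refines its spatial appearance. The module internals here (the placeholder CNNs and the residual refinement) are illustrative assumptions, not the paper's actual architecture.

```python
# Hypothetical sketch of the two-stage pipeline; the flow and refinement
# networks below are placeholders, not the architecture from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

def warp(frame, flow):
    """Backward-warp a frame (B, 1, H, W) by a dense flow field (B, 2, H, W)."""
    b, _, h, w = frame.shape
    ys, xs = torch.meshgrid(
        torch.arange(h, device=frame.device),
        torch.arange(w, device=frame.device),
        indexing="ij",
    )
    base = torch.stack((xs, ys), dim=0).float().unsqueeze(0)  # (1, 2, H, W)
    coords = base + flow                                      # sample locations
    # Normalize coordinates to [-1, 1] as required by grid_sample.
    gx = 2.0 * coords[:, 0] / (w - 1) - 1.0
    gy = 2.0 * coords[:, 1] / (h - 1) - 1.0
    grid = torch.stack((gx, gy), dim=-1)                      # (B, H, W, 2)
    return F.grid_sample(frame, grid, align_corners=True)

class TwoStageNowcaster(nn.Module):
    def __init__(self, in_frames=5):
        super().__init__()
        # Stage 1: learnable optical flow module (placeholder CNN mapping
        # the input sequence to a dense motion field).
        self.flow_net = nn.Sequential(
            nn.Conv2d(in_frames, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),
        )
        # Stage 2: feature reconstruction module (placeholder; the paper
        # refines appearance with Halo attention blocks instead).
        self.refine_net = nn.Sequential(
            nn.Conv2d(in_frames + 1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, seq):                # seq: (B, T, H, W) radar echoes
        flow = self.flow_net(seq)          # motion trend of the echoes
        coarse = warp(seq[:, -1:], flow)   # coarse next-frame prediction
        # Refine the coarse frame conditioned on the full history.
        fine = self.refine_net(torch.cat([seq, coarse], dim=1))
        return coarse + fine               # residual appearance refinement
```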
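The Halo attention mechanism named in the title originates from HaloNet (Vaswani et al., 2021): queries attend within non-overlapping spatial blocks, while keys and values are gathered from slightly larger "haloed" windows around each block, so local attention still sees some surrounding context. Below is a hedged sketch built on unfold/fold; the block size, halo width, and head count are hypothetical, and the paper's exact integration into the reconstruction module is not reproduced. It assumes the feature map height and width are divisible by the block size and the channel count by the head count.

```python
# Sketch of blocked local attention with a halo, in the spirit of HaloNet.
# Block size, halo width, and head count are illustrative choices.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HaloAttention(nn.Module):
    def __init__(self, dim, block=4, halo=1, heads=4):
        super().__init__()
        self.block, self.halo, self.heads = block, halo, heads
        self.scale = (dim // heads) ** -0.5
        self.to_q = nn.Conv2d(dim, dim, 1, bias=False)
        self.to_kv = nn.Conv2d(dim, dim * 2, 1, bias=False)

    def forward(self, x):                  # x: (B, C, H, W)
        b, c, h, w = x.shape
        blk, win = self.block, self.block + 2 * self.halo
        q = self.to_q(x)
        k, v = self.to_kv(x).chunk(2, dim=1)
        # Queries come from non-overlapping block x block windows.
        q = F.unfold(q, blk, stride=blk)                  # (B, C*blk*blk, nB)
        # Keys/values come from haloed windows centered on the same blocks.
        k = F.unfold(k, win, stride=blk, padding=self.halo)
        v = F.unfold(v, win, stride=blk, padding=self.halo)
        nb = q.shape[-1]

        def heads_first(t, size):
            # (B, C*size*size, nB) -> (B*nB*heads, size*size, C/heads)
            t = t.reshape(b, self.heads, c // self.heads, size * size, nb)
            return t.permute(0, 4, 1, 3, 2).reshape(-1, size * size, c // self.heads)

        q = heads_first(q, blk) * self.scale
        k, v = heads_first(k, win), heads_first(v, win)
        attn = (q @ k.transpose(1, 2)).softmax(dim=-1)  # block attends to halo
        out = attn @ v                                  # (B*nB*heads, blk*blk, C/heads)
        out = out.reshape(b, nb, self.heads, blk * blk, c // self.heads)
        out = out.permute(0, 2, 4, 3, 1).reshape(b, c * blk * blk, nb)
        # Stitch the non-overlapping query blocks back into a feature map.
        return F.fold(out, (h, w), blk, stride=blk)
```

In this reading, such blocks would sit in the feature reconstruction stage in place of plain convolutions, letting the appearance refinement aggregate context beyond each local block while keeping the attention cost local rather than quadratic in image size.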

Keywords: deep learning; precipitation nowcasting; optical flow; attention mechanism; two-stage prediction

ZHOU Yunlong, JI Fanfan, PAN Zefeng


School of Computer Science, Nanjing University of Information Science and Technology, Nanjing 210044, China

School of Electronic and Information Engineering, Nanjing University of Information Science and Technology, Nanjing 210044, China


Funding: Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2018AAA0100400); National Natural Science Foundation of China (U21B2049); National Natural Science Foundation of China (61936005)

2024

Computer Systems & Applications (计算机系统应用), Institute of Software, Chinese Academy of Sciences

CSTPCD
Impact factor: 0.449
ISSN: 1003-3254
Year, volume (issue): 2024, 33(5)