
Remote Sensing Image Detection Algorithm Based on Multi-receptive Field and Dynamic Feature Refinement

To address the low detection accuracy caused by small target sizes, drastic scale variation, target aggregation, and complex backgrounds in remote sensing image object detection, an improved YOLOv7-based algorithm, DF-YOLOv7, is proposed. First, the strategy in YOLOv7 whose excessive downsampling causes information loss is removed, and the layer structure is modified to improve the detection accuracy for small objects while lightening the network. Second, a multi-receptive-field MRELAN module is proposed to replace part of the ELAN modules, providing a stronger multi-scale feature representation, and an efficient multi-scale attention mechanism with cross-spatial learning is embedded to adapt to complex scenes. Finally, a contextual dynamic feature refinement module is proposed, which filters redundant features to highlight the feature differences of low-level small-target information and improves the representation of dense targets. Comparative experiments on the VisDrone2019 and DOTA datasets show that the improved algorithm clearly outperforms other mainstream algorithms: compared with YOLOv7, its accuracy increases by 3.3 and 2.3 percentage points on the two datasets, respectively, while the number of parameters decreases by 50.8%; compared with YOLOv5s, it is 20.1 percentage points higher on VisDrone.
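The abstract does not spell out the internal layout of the MRELAN module. As a rough illustration of the multi-receptive-field idea it names, the following minimal sketch (an assumption, not the authors' design) uses parallel 3x3 convolutions with different dilation rates so that one block aggregates context at several scales; the class name `MultiReceptiveFieldBlock` and the dilation choices are hypothetical.

```python
# Illustrative sketch only: the paper does not publish MRELAN's exact layout.
# Parallel 3x3 convolutions with different dilation rates see different context
# sizes; their outputs are fused so one layer carries multi-scale features.
import torch
import torch.nn as nn


class MultiReceptiveFieldBlock(nn.Module):
    """Hypothetical multi-receptive-field block (not the authors' MRELAN)."""

    def __init__(self, channels: int, dilations=(1, 2, 3)):
        super().__init__()
        branch_ch = channels // len(dilations)
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(channels, branch_ch, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(branch_ch),
                nn.SiLU(inplace=True),
            )
            for d in dilations
        )
        # 1x1 fusion back to the input width, plus a residual connection.
        self.fuse = nn.Conv2d(branch_ch * len(dilations), channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        multi_scale = torch.cat([b(x) for b in self.branches], dim=1)
        return x + self.fuse(multi_scale)


if __name__ == "__main__":
    # A mid-level backbone feature map, e.g. 256 channels at 80x80.
    feats = torch.randn(1, 256, 80, 80)
    block = MultiReceptiveFieldBlock(256)
    print(block(feats).shape)  # torch.Size([1, 256, 80, 80])
```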
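Likewise, the contextual dynamic feature refinement module is only described functionally (filtering redundant features so low-level small-target detail stands out). One common way such refinement is realized, shown below purely as an assumed sketch (the module name `ContextGatedRefinement` and its structure are not from the paper), is to let pooled context from a deeper feature map gate the channels of a shallow, high-resolution map before fusion in the detection neck.

```python
# Illustrative sketch only: not the paper's refinement module. Global context
# from a deep feature map produces per-channel weights that damp redundant
# responses in the shallow map, emphasizing fine small-target detail.
import torch
import torch.nn as nn


class ContextGatedRefinement(nn.Module):
    """Hypothetical context-driven gating of low-level features."""

    def __init__(self, low_ch: int, high_ch: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),        # global context of the deep map
            nn.Conv2d(high_ch, low_ch, 1),  # map context to low-level channels
            nn.Sigmoid(),                   # per-channel weights in (0, 1)
        )

    def forward(self, low: torch.Tensor, high: torch.Tensor) -> torch.Tensor:
        # Re-weight shallow channels by context; redundant channels are damped.
        return low * self.gate(high)


if __name__ == "__main__":
    low = torch.randn(1, 128, 160, 160)   # shallow, high-resolution features
    high = torch.randn(1, 512, 20, 20)    # deep, semantically rich features
    refined = ContextGatedRefinement(128, 512)(low, high)
    print(refined.shape)  # torch.Size([1, 128, 160, 160])
```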

remote sensing; small target detection; deep learning; YOLOv7; feature refinement

Huang Jun, Guo Ying, Yan Shu


Jiangsu Collaborative Innovation Center of Atmospheric Environment and Equipment Technology, Nanjing University of Information Science and Technology, Nanjing 210044, Jiangsu, China

School of Automation, Nanjing University of Information Science and Technology, Nanjing 210044, Jiangsu, China


Laser & Optoelectronics Progress
Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 1.153
ISSN: 1006-4125
Year, Volume (Issue): 2024, 61(22)