Object Detection Method with Multi-scale Feature Fusion for Remote Sensing Images
Object detection in remote sensing images is an important research direction in computer vision, with wide applications in both military and civilian fields. Objects in remote sensing images are characterized by multiple scales, dense arrangement, and inter-class similarity, so object detection methods designed for natural images suffer from frequent missed and false detections on remote sensing images. To address this problem, this paper proposes an object detection method with multi-scale feature fusion for remote sensing images, built on YOLOv5. First, a residual unit that fuses multi-head self-attention is introduced into the backbone network; this module fully extracts multi-level feature information and reduces the semantic gap between different scales. Second, a feature pyramid network that fuses a lightweight upsampling operator is introduced to obtain high-level semantic features and low-level detail features; feature fusion then yields feature maps with richer information, improving the feature resolution of objects at different scales. The effectiveness of the proposed method is evaluated on the public datasets DOTA and NWPU VHR-10, where its accuracy (mAP) improves on the baseline model by 1.5% and 2.0%, respectively.
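
The abstract gives no implementation details for the residual unit that fuses multi-head self-attention, so the following is a minimal PyTorch sketch under assumed design choices: a BoTNet-style block in which spatial self-attention over flattened feature-map tokens sits inside a residual connection. The class name MHSAResidualUnit and all hyperparameters are hypothetical, not taken from the paper.

import torch
import torch.nn as nn

class MHSAResidualUnit(nn.Module):
    # Sketch only: the paper states that a residual unit fusing multi-head
    # self-attention is inserted into the YOLOv5 backbone, but not its exact
    # layout. Positional encodings are omitted here for brevity.
    def __init__(self, channels: int, num_heads: int = 4):
        super().__init__()
        assert channels % num_heads == 0  # required by nn.MultiheadAttention
        self.proj_in = nn.Conv2d(channels, channels, kernel_size=1)
        self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(channels)
        self.proj_out = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        y = self.proj_in(x)
        tokens = y.flatten(2).transpose(1, 2)       # (B, H*W, C) token sequence
        attn_out, _ = self.attn(tokens, tokens, tokens)
        tokens = self.norm(tokens + attn_out)       # attention with inner residual
        y = tokens.transpose(1, 2).reshape(b, c, h, w)
        return x + self.proj_out(y)                 # outer residual connection

unit = MHSAResidualUnit(channels=256, num_heads=4)
out = unit(torch.randn(1, 256, 20, 20))             # -> (1, 256, 20, 20)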
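
Likewise, the "lightweight upsampling operator" in the feature pyramid is not specified; CARAFE is a common choice in YOLOv5 variants and is assumed here. The sketch below predicts a normalized k x k reassembly kernel per upsampled pixel from the feature content, then computes each output pixel as a weighted sum over the corresponding input neighborhood. The class name LightweightUpsampler is hypothetical.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LightweightUpsampler(nn.Module):
    # CARAFE-style content-aware upsampling (assumed, not confirmed by the paper).
    def __init__(self, channels: int, scale: int = 2, k: int = 5, mid: int = 64):
        super().__init__()
        self.scale, self.k = scale, k
        # Kernel prediction branch: one k*k kernel per sub-pixel position.
        self.kernel_pred = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.Conv2d(mid, scale * scale * k * k, kernel_size=3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        s, k = self.scale, self.k
        kernels = self.kernel_pred(x).view(b, s * s, k * k, h, w)
        kernels = F.softmax(kernels, dim=2)          # normalize each reassembly kernel
        # Gather the k x k neighborhood of every source location: (B, C, k*k, H, W).
        patches = F.unfold(x, k, padding=k // 2).view(b, c, k * k, h, w)
        # Weighted sum over the neighborhood for each of the s*s sub-pixels.
        out = torch.einsum('bcnhw,bsnhw->bcshw', patches, kernels)
        out = out.view(b, c, s, s, h, w).permute(0, 1, 4, 2, 5, 3)
        return out.reshape(b, c, h * s, w * s)

up = LightweightUpsampler(channels=256, scale=2)
y = up(torch.randn(1, 256, 20, 20))                  # -> (1, 256, 40, 40)

In the top-down path of the feature pyramid, such an operator would replace the usual nearest-neighbour upsampling before the coarse level is fused with the lateral features of the finer level.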

Keywords: Remote sensing images; Object detection; Multi-scale features; Feature fusion; YOLOv5

Authors: ZHANG Yang, XIA Ying (张洋, 夏英)


Affiliation: College of Computer Science and Technology, Chongqing University of Posts and Telecommunications, Chongqing 400065, China


Funding: National Natural Science Foundation of China (41871226); Key Cooperation Project of Chongqing Municipal Education Commission (HZ2021008)

Journal: Computer Science (计算机科学), 2024
Publisher: Chongqing Southwest Information Co., Ltd. (formerly the Southwest Information Center of the Ministry of Science and Technology)

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.944
ISSN: 1002-137X
Year, Volume (Issue): 2024, 51(3)