Object Detection Method for Remote Sensing Images Based on Improved YOLOv5s

To address the densely arranged small targets and complex background regions in remote sensing image object detection, the YOLOv5s model is improved. The backbone network adopts a coordinate attention (CA) module containing depthwise separable convolution, introducing a multi-dimensional channel-and-spatial attention mechanism that exploits correlations between spatial direction and position, thereby improving feature extraction and the ability to capture long-range dependencies. The neck network adopts a bidirectional feature pyramid network (BiFPN) structure to fully fuse deep and shallow feature information and improve feature fusion across scales. Experiments on the DIOR remote sensing dataset show that, compared with the baseline model, the improved model raises mean average precision (mAP) by 9.8 percentage points; average precision (AP) improves for every category, by more than 5 percentage points for most; precision increases by 7.2 percentage points and recall by 10.8 percentage points. These gains alleviate missed and false detections and strengthen the model's detection of dense small targets against complex backgrounds in remote sensing images.
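As a rough illustration of the coordinate attention idea described above (not the authors' implementation), the sketch below applies direction-aware pooling and per-direction reweighting to a single feature map. The 1x1 convolutions are modeled as random channel-mixing matrices standing in for learned parameters, and the depthwise separable convolution is omitted for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def coordinate_attention(x, reduction=4, seed=0):
    """Simplified coordinate attention for one feature map x of shape (C, H, W).

    Random matrices replace the learned 1x1 convolutions; this only shows the
    data flow: pool per direction, mix channels, re-expand, reweight.
    """
    C, H, W = x.shape
    rng = np.random.default_rng(seed)
    Cr = max(1, C // reduction)  # reduced channel count

    # Direction-aware pooling: average over width and over height.
    pool_h = x.mean(axis=2)  # (C, H) -- encodes vertical positions
    pool_w = x.mean(axis=1)  # (C, W) -- encodes horizontal positions

    # Shared 1x1 "conv" on the concatenated descriptor, then ReLU.
    W1 = rng.standard_normal((Cr, C)) / np.sqrt(C)
    y = np.maximum(W1 @ np.concatenate([pool_h, pool_w], axis=1), 0.0)  # (Cr, H+W)

    # Split back into the two directions and expand to attention maps in (0, 1).
    y_h, y_w = y[:, :H], y[:, H:]
    Wh = rng.standard_normal((C, Cr)) / np.sqrt(Cr)
    Ww = rng.standard_normal((C, Cr)) / np.sqrt(Cr)
    a_h = sigmoid(Wh @ y_h)  # (C, H)
    a_w = sigmoid(Ww @ y_w)  # (C, W)

    # Reweight the input along both spatial directions.
    return x * a_h[:, :, None] * a_w[:, None, :]
```

Because each position is scaled by a height attention and a width attention jointly, the module can emphasize where a target lies along both axes, which is the property the abstract credits for better localization of small, densely packed targets.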
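The BiFPN structure mentioned above combines features from different scales with learnable, normalized non-negative weights ("fast normalized fusion"). A generic sketch, not the paper's code, with the weights passed in explicitly rather than learned:

```python
import numpy as np

def bifpn_fuse(features, weights, eps=1e-4):
    """Fast normalized fusion used in BiFPN:

        out = sum_i w_i * F_i / (sum_j w_j + eps),  w_i = max(w_i, 0)

    `features` is a list of equally shaped maps (already resized to one scale);
    clamping keeps every weight non-negative so the output stays bounded.
    """
    w = np.maximum(np.asarray(weights, dtype=float), 0.0)
    stacked = np.stack(features)                   # (n, C, H, W)
    return np.tensordot(w, stacked, axes=1) / (w.sum() + eps)
```

Relative to plain summation, the normalized weights let the network learn how much each input scale should contribute at every fusion node, which is how BiFPN balances deep (semantic) and shallow (detail) information.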

machine vision; remote sensing image; object detection; YOLOv5s; attention mechanisms

程凯伦、胡晓兵、陈海军、李虎


School of Mechanical Engineering, Sichuan University, Chengdu 610065, Sichuan, China

Yibin Industrial Technology Research Institute of Sichuan University, Yibin 644005, Sichuan, China


Sichuan University-Yibin City-University Strategic Cooperation Project

2020CDYB-3

2024

Laser & Optoelectronics Progress
Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 1.153
ISSN:1006-4125
Year, Volume(Issue): 2024, 61(18)