An improved YOLOv7-OBB ship identification method
To address the low accuracy of ship identification in high-resolution remote sensing images, an improved YOLOv7-OBB ship identification method is proposed. Introducing the oriented bounding box (OBB) and the KLD loss effectively reduces missed detections caused by densely arranged ships with slender, arbitrarily oriented shapes, and preserves each ship's orientation information while improving localization accuracy. The hybrid attention module ACmix is added to the backbone of the YOLOv7 base framework to sharpen the network's sensitivity to small targets and improve detection accuracy for small vessels. Adding the global attention mechanism (NAMAttention) and partial convolution (PConv) to the neck improves the PAN network's ability to capture key features in complex backgrounds while keeping the model lightweight. Experimental results show that, compared with the YOLOv7 model, the proposed method achieves 88.5% average precision, 93.0% precision, and 84.7% recall on the DOTAships dataset, improvements of 5%, 0.9%, and 3.9%, respectively. Compared with current mainstream algorithms, the method delivers a clear improvement in detection performance.
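To illustrate the KLD loss mentioned in the abstract: in the rotated-detection literature, an oriented box (cx, cy, w, h, θ) is commonly modeled as a 2-D Gaussian, and the Kullback-Leibler divergence between the predicted and ground-truth Gaussians drives the regression. The following is a minimal numpy sketch under that standard modeling; the function names are illustrative, not from the paper's code.

```python
import numpy as np

def obb_to_gaussian(cx, cy, w, h, theta):
    """Model an oriented box (center, size, rotation in radians) as a 2-D Gaussian."""
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    mu = np.array([cx, cy])
    # Half-extents of the box give the standard deviations along its own axes;
    # the rotation matrix maps the diagonal covariance into image coordinates.
    sigma = R @ np.diag([w**2 / 4.0, h**2 / 4.0]) @ R.T
    return mu, sigma

def kld(mu_p, sig_p, mu_t, sig_t):
    """KL divergence KL(N_pred || N_target) between two 2-D Gaussians."""
    inv_t = np.linalg.inv(sig_t)
    d = mu_t - mu_p
    return 0.5 * (np.trace(inv_t @ sig_p) + d @ inv_t @ d - 2.0
                  + np.log(np.linalg.det(sig_t) / np.linalg.det(sig_p)))
```

Identical boxes give zero divergence, and for an elongated box the divergence grows smoothly with angle error, which is what makes this formulation effective for slender, arbitrarily oriented targets; a bounded loss is typically obtained by a normalization such as 1 − 1/(1 + log(1 + KLD)).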

YOLOv7-OBB algorithm; ship identification; oriented bounding box; ACmix; NAMAttention; PConv

SUN Honglei, CHEN Wenbai, LIU Huixiang


School of Automation, Beijing Information Science and Technology University, Beijing 100192, China


National Natural Science Foundation of China (62276028)

2024

Journal of Ordnance Equipment Engineering
Chongqing (Sichuan) Ordnance Society; Chongqing University of Technology


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.478
ISSN:2096-2304
Year, volume (issue): 2024, 45(8)