

Nearshore Ship Object Detection Method Based on Appearance Fine-grained Discrimination Network
Nearshore ship object detection is a challenging task that has received widespread attention from researchers. Detectors based on Convolutional Neural Networks (CNN) and attention mechanisms have made significant progress in nearshore ship object detection. However, the apparent similarity of ship targets and background interference cause false detections during the detection process. To address this problem, this paper proposes a detection head module with appearance fine-grained discrimination for Faster RCNN. The module comprises a category fine-grained branch and an efficient omni-dimensional dynamic convolution localization branch. The category fine-grained branch mines and exploits category-level fine-grained discriminative features through global feature modeling and a flexible perception range; the efficient omni-dimensional dynamic convolution localization branch distinguishes objects from the background through efficient and flexible perception of ship boundary information, thereby reducing false and missed detections. Experiments on the public nearshore ship dataset Seaships7000 show that the proposed algorithm reduces false and missed detections and improves detector performance.
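The localization branch above builds on dynamic convolution, whose core idea is to mix several candidate kernels with input-conditioned attention rather than use one fixed kernel (omni-dimensional variants apply such attention along multiple dimensions of the kernel tensor). The paper's exact module is not specified here; as a rough illustration of the kernel-number dimension of that idea only, a minimal NumPy sketch, with all names, shapes, and the tiny attention head being hypothetical:

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def dynamic_conv2d(x, kernels, attn_w, attn_b):
    """Sketch of kernel-wise dynamic convolution (one attention dimension
    only; not the paper's full omni-dimensional module).

    x       : (H, W) single-channel feature map
    kernels : (K, k, k) candidate kernels
    attn_w  : (K,) weights of a tiny linear attention head (hypothetical)
    attn_b  : (K,) bias of that head
    """
    # Global average pooling gives a scalar context signal for this input.
    ctx = x.mean()
    # Input-conditioned attention over the K candidate kernels.
    alpha = softmax(attn_w * ctx + attn_b)            # (K,), sums to 1
    # Aggregate the candidates into one kernel for this specific input.
    kernel = np.tensordot(alpha, kernels, axes=1)     # (k, k)
    # Plain 'valid' cross-correlation with the aggregated kernel.
    k = kernel.shape[0]
    H, W = x.shape
    out = np.zeros((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (x[i:i + k, j:j + k] * kernel).sum()
    return out, alpha
```

Because the attention depends on the input, two different feature maps are effectively convolved with two different kernels, which is what lets such a branch adapt its boundary perception per image.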

Ship object detection; Similar feature extraction; Apparent discrimination; Dynamic convolution; Self-attention

闵令通、范子满、窦飞阳、吕勤毅、李鑫


School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710072, China

Ship object detection; Category fine-grained; Apparent discrimination; Omni-dimensional dynamic convolution; Self-attention

National Natural Science Foundation of China; Natural Science Basic Research Program of Shaanxi Province

62206221; 2021JM-074

2024

遥测遥控
No. 704 Research Institute, China Aerospace Corporation


CSTPCD
Impact factor: 0.28
Year, Volume (Issue): 2024, 45(2)