Target Polarization/Visible Light Fusion Detection Method for Dark Light Scenes
To address the low accuracy of common-target recognition in dark-light polarization scenes, this paper proposes a fusion algorithm for polarization-degree images and visible-light images based on a convolutional neural network; a new loss function is designed to form an unsupervised learning process, and the Laplacian operator is introduced to improve the quality of the fused image, so that the polarization information of the measured target is effectively combined with its visible-light information. For detecting targets in the fused images, an improved YOLOv5 algorithm is proposed in which a coordinate attention (CA) mechanism is added to the network framework, combining channel attention with spatial attention. The proposed networks are trained and tested on a self-built dataset. The results show that the fused images achieve good visual quality both subjectively and objectively, and that, compared with the best-performing YOLOv5s model, the improved YOLOv5 algorithm reaches a precision of 89.3% and a recall of 82.5%, while its mean average precision values are higher by 2.6% and 1.8%, respectively.
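The unsupervised fusion loss described above can be made concrete with a short sketch. The following is a minimal illustration in PyTorch, not the authors' released code: the function names (`laplacian`, `fusion_loss`), the max-gradient target, and the weights `w_int` and `w_lap` are assumptions, and the polarization-degree (DoLP) input is assumed to be precomputed from the Stokes parameters as DoLP = sqrt(S1^2 + S2^2)/S0.

```python
# Hypothetical sketch of an unsupervised fusion loss with a Laplacian term;
# the weighting scheme and gradient target are illustrative assumptions.
import torch
import torch.nn.functional as F

# 3x3 Laplacian kernel for single-channel images, shaped for F.conv2d
LAPLACIAN = torch.tensor([[0., 1., 0.],
                          [1., -4., 1.],
                          [0., 1., 0.]]).view(1, 1, 3, 3)

def laplacian(img: torch.Tensor) -> torch.Tensor:
    """Second-derivative (edge) response; img has shape (B, 1, H, W)."""
    return F.conv2d(img, LAPLACIAN.to(img.device), padding=1)

def fusion_loss(fused, visible, dolp, w_int=1.0, w_lap=10.0):
    """Keep the fused image close to both inputs in intensity, and push
    its Laplacian toward the stronger edge response of the two sources,
    so no ground-truth fused image is needed (unsupervised)."""
    loss_int = F.mse_loss(fused, visible) + F.mse_loss(fused, dolp)
    target_lap = torch.max(laplacian(visible).abs(), laplacian(dolp).abs())
    loss_lap = F.l1_loss(laplacian(fused).abs(), target_lap)
    return w_int * loss_int + w_lap * loss_lap
```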
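The coordinate attention (CA) mechanism named in the abstract is a published design (Hou et al., CVPR 2021) that pools features along the height and width directions separately, so the resulting attention maps encode channel information together with spatial position. The sketch below renders that published module in PyTorch for reference; where the paper inserts it inside YOLOv5 is not specified here, and the reduction ratio of 32 is the CA paper's default rather than a value from this article.

```python
import torch
import torch.nn as nn

class CoordAtt(nn.Module):
    """Coordinate attention: directional pooling couples channel
    attention with spatial position (Hou et al., CVPR 2021)."""
    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.Hardswish()
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x):
        b, c, h, w = x.shape
        x_h = self.pool_h(x)                      # (B, C, H, 1)
        x_w = self.pool_w(x).permute(0, 1, 3, 2)  # (B, C, W, 1)
        # Joint 1x1 transform over the concatenated directional features
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(y_w.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        return x * a_h * a_w  # reweight features; output shape equals input
```

Because the module preserves the feature-map shape, it can in principle be dropped in after existing YOLOv5 backbone or neck blocks without changing the surrounding layer definitions.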

polarization imaging; target detection; image fusion; YOLOv5; image enhancement; attention mechanism

Ma Ruyue, Wang Chenguang, Cao Huiliang, Shen Chong, Tang Jun, Liu Jun

School of Information and Communication Engineering, North University of China, Taiyuan 030051, China

School of Instrument and Electronics, North University of China, Taiyuan 030051, China


Funding: National Natural Science Foundation of China (61973281, 51821003, 51922009); Key Research and Development Program of Shanxi Province (202003D111003); Shanxi Province Outstanding Youth Cultivation Project (202103021222011); Key Laboratory of Optoelectronic Information Control and Security Technology project (2021-JCJQ-LB-055-010); Shanxi Province Key Laboratory of Quantum Sensing and Precision Measurement project (201905D121001); Shanxi Province "1331 Project"

2024

Computer Measurement & Control
China Computer Automated Measurement and Control Technology Association

CSTPCD
Impact factor: 0.546
ISSN:1671-4598
Year, Volume (Issue): 2024, 32(4)