Method for Dual-Field-of-View Target Handoff Based on Feature Association
Yi Lina 1, Wang Xiangjun 1, Wang Lin 1, Xu Zongwei 2
Author Information
- 1. State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China; Key Laboratory of Micro Opto-Electro-Mechanical System Technology of Ministry of Education, Tianjin University, Tianjin 300072, China
- 2. State Key Laboratory of Precision Measuring Technology and Instruments, Tianjin University, Tianjin 300072, China
Abstract
To address the scale-variation challenge during target handoff in dual-camera vision systems, this paper proposes a dual-field-of-view target handoff method based on feature association. The approach first localizes the target coarsely in the switched field of view using a homography matrix, then employs an optimized YOLOv5 object detection network to search for candidate targets, and finally uses an enhanced OSNet network for feature association. To improve handoff accuracy, the loss function of YOLOv5 was optimized, and a bottleneck attention module and a cosine distance metric were introduced into OSNet. Experimental results on the CrowdHuman and Market-1501 datasets show that the optimized YOLOv5 network raises average precision by 1.0 percentage point, to 38.5%, and that the improved OSNet network raises mean average precision by 5.4 percentage points, to 68.1%. Deployed on the Rockchip RK3399Pro embedded platform with two 60 frame/s cameras (both 1600×1200 resolution, with focal lengths of 35 mm and 8 mm, respectively) in field experiments, the method completes an accurate target handoff within 14 frames, demonstrating its feasibility and stability in real-world surveillance scenarios.
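The two core steps of the handoff pipeline described above, coarse localization via a homography and feature association under a cosine distance metric, can be sketched minimally as follows. This is an illustrative sketch in plain NumPy, not the paper's implementation: the function names are hypothetical, and in the actual method the candidate boxes come from the optimized YOLOv5 detector and the appearance features from the improved OSNet network.

```python
import numpy as np

def project_point(H, pt):
    """Map a pixel (x, y) from one camera's image into the other camera's
    image using a 3x3 homography H, via homogeneous coordinates."""
    x, y = pt
    v = H @ np.array([x, y, 1.0])
    return v[:2] / v[2]  # dehomogenize

def cosine_distance(a, b):
    """Cosine distance (1 - cosine similarity) between two appearance
    feature vectors."""
    a = a / np.linalg.norm(a)
    b = b / np.linalg.norm(b)
    return 1.0 - float(a @ b)

def associate(query_feat, candidate_feats):
    """Return the index of the candidate whose appearance feature is
    closest to the query target's feature under cosine distance."""
    dists = [cosine_distance(query_feat, f) for f in candidate_feats]
    return int(np.argmin(dists))
```

The homography projection gives a coarse search region in the new field of view, so only detections near the projected point need to be scored; the cosine distance then resolves which candidate is the handed-off target.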
Keywords
feature association / target handoff / deep learning / attention mechanism / similarity measurement
Publication Year
2024