Dairy cow body size measurement method based on binocular stereo matching and improved YOLOv8n-Pose keypoint detection
DENG Hongxing 1, XU Xingshi 1, WANG Yunfei 1, ZHANG Shujin 1, SONG Huaibo 1
Author information
- 1. College of Mechanical and Electronic Engineering, Northwest A&F University / Key Laboratory of Agricultural Internet of Things, Ministry of Agriculture and Rural Affairs, Yangling, Shaanxi 712100, China
Abstract
[Objective] To achieve accurate measurement of dairy cow body size and precisely assess dairy cow conformation. [Method] To address the limited accuracy and low degree of automation in dairy cow body size measurement, a measurement method based on binocular stereo matching and an improved YOLOv8n-Pose was proposed. The deep-learning-based CREStereo network was applied for stereo matching to obtain depth information. The SimAM attention mechanism was introduced into YOLOv8n-Pose so that the network attends more closely to individual cow identification and keypoint location information, and CoordConv convolution was adopted to improve the network structure and strengthen its spatial coordinate perception. [Result] The improved YOLOv8n-Pose detected the body size measurement keypoints of dairy cows rapidly and accurately, achieving a detection precision of 94.3% with 2.99 M model parameters, 8.40 G floating-point operations, and a detection speed of 55.6 frames/s. By fusing binocular stereo matching with the improved YOLOv8n-Pose keypoint detection, the maximum average relative error of body size measurement was 4.19%. [Conclusion] The proposed body size measurement method offers high accuracy and fast detection speed, and can meet the practical requirements of dairy cow body size measurement.
Keywords
Body size measurement / Binocular stereo vision / Keypoint detection / Dairy cow
Funding
National Key Research and Development Program of China (2023YFD1301800)
National Natural Science Foundation of China (32272931)
Publication year
2024