

Farmland Boundary Line Prediction Method for Autonomous Agricultural Machines
The fast and accurate extraction of farmland boundaries is the basis for the autonomous, safe operation of self-driving agricultural machines in the field, and it also provides basic data for digital farm management. Traditional image-based feature extraction of farmland boundary lines suffers from low accuracy and incomplete boundary extraction. In this study, we constructed an annotated farmland image dataset and proposed a farmland boundary acquisition method based on UAV remote sensing images: an improved semantic segmentation model based on DeeplabV3+ was designed, the boundary of the binary segmentation image was traced with a boundary-tracking function, and, after eliminating outlier points, the fitted boundary line was obtained with a least-squares algorithm. Field experiments show that the IoU of the network is 92.78% and 92.69% for crop-covered and non-crop-covered farmland, respectively, the mean pixel accuracy (mPA) is 93.96%, and the mean vertical error and mean angular error of the extracted boundary line are 4% and 0.62°, respectively. This study can provide technical support for the positioning and path planning of autonomous agricultural machines.
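The post-segmentation steps described in the abstract (trace the boundary of the binary mask, discard outlier points, fit a line by least squares) can be sketched as follows. This is a minimal illustration with a synthetic mask, not the paper's implementation: the per-row boundary scan, the residual-based outlier threshold, and all function names are assumptions introduced here for clarity.

```python
import numpy as np

def extract_boundary_points(mask):
    # Hypothetical stand-in for the paper's boundary-tracing step:
    # for each image row, take the last column labeled as field (1)
    # as a candidate boundary point (row, col).
    pts = []
    for r in range(mask.shape[0]):
        cols = np.flatnonzero(mask[r])
        if cols.size:
            pts.append((r, cols[-1]))
    return np.array(pts, dtype=float)

def fit_boundary_line(points, outlier_sigma=2.0):
    # Initial least-squares fit col = a*row + b, then drop points
    # whose residual deviates by more than outlier_sigma standard
    # deviations and refit (mirrors "eliminate outliers, then fit
    # by least squares"); the 2-sigma rule is an assumption.
    rows, cols = points[:, 0], points[:, 1]
    a, b = np.polyfit(rows, cols, 1)
    resid = cols - (a * rows + b)
    keep = np.abs(resid - resid.mean()) <= outlier_sigma * resid.std()
    a, b = np.polyfit(rows[keep], cols[keep], 1)
    return a, b

# Synthetic 100x100 binary mask whose field edge follows col = 0.5*row + 20.
H, W = 100, 100
rr, cc = np.mgrid[0:H, 0:W]
mask = (cc <= 0.5 * rr + 20).astype(np.uint8)

pts = extract_boundary_points(mask)
pts[10, 1] += 30  # inject one spurious boundary point
a, b = fit_boundary_line(pts)
print(f"slope={a:.2f}, intercept={b:.2f}")
```

With the injected outlier rejected, the recovered slope and intercept stay close to the true edge parameters (0.5 and 20), which is the behavior the outlier-elimination step is meant to guarantee.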

Automated driving of agricultural machinery; Boundary extraction; Semantic segmentation; Linear fitting

Lu Hao, Wang Hao, Fu Weiqiang, Wen Changkai, Mei Hebo, Shan Yongchao, Qin Weixian, Meng Zhijun


College of Information and Electrical Engineering, Heilongjiang Bayi Agricultural University, Daqing 163319, Heilongjiang, China

Intelligent Equipment Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China

State Key Laboratory of Intelligent Agricultural Power Equipment, Beijing 100097, China


National Key R&D Program of China (14th Five-Year Plan); National Natural Science Foundation of China (General Program); Outstanding Scientist Recruitment Program of Beijing Academy of Agriculture and Forestry Sciences; Youth Fund of Beijing Academy of Agriculture and Forestry Sciences; China Postdoctoral Science Foundation (General Program)

2022YFD2001605; 31971800; QNJJ202320; 2022M720490

2024

Tractor & Farm Transporter
Luoyang Tractor Research Institute; Luoyang Xiyuan Vehicle and Power Inspection Institute Co., Ltd.; Tractor Branch of the Chinese Society for Agricultural Machinery


Impact factor: 0.157
ISSN: 1006-0006
Year, Volume (Issue): 2024, 51(2)