
LiDAR and Camera External Parameter Calibration Method Based on Multi-Dimensional Dynamic Convolution

The rapid development of autonomous driving necessitates the fusion of data from multiple sensors to accurately perceive the surrounding environment. Central to this is the precise calibration of LiDAR and camera systems, which forms the basis for effective data integration. Classical neural networks used for image feature extraction often yield incomplete or inaccurate features, thereby undermining the accuracy of LiDAR-camera extrinsic calibration. To address this challenge, we propose a method based on multi-dimensional dynamic convolution for the extrinsic calibration of LiDAR and camera systems. The data are first preprocessed with random transformations and then fed into a feature extraction network built on multi-dimensional dynamic convolution; the network outputs rotation and translation vectors through a feature aggregation mechanism. Geometric and transformation supervision are employed to guide the learning process. Experimental results show that the proposed method enhances the feature extraction capability of the neural network and further improves extrinsic calibration accuracy. Compared with the best result among the compared methods, the proposed method reduces the average error of translation prediction by 0.7 cm, validating the effectiveness of the proposed calibration method.
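The paper itself does not provide code; as a rough illustration of the kernel-mixing idea behind dynamic convolution that the abstract refers to (in the spirit of CondConv/ODConv-style input-conditioned kernels), here is a minimal NumPy sketch. All names, shapes, and the attention design are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a 1-D logit vector."""
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def dynamic_conv2d(x, kernels, w_attn):
    """Input-conditioned convolution: mix K candidate kernels with
    attention computed from the globally pooled input, then convolve.

    x       : (C_in, H, W)              input feature map
    kernels : (K, C_out, C_in, k, k)    candidate kernel bank
    w_attn  : (K, C_in)                 projection producing one logit per candidate
    """
    K, C_out, C_in, k, _ = kernels.shape
    # 1) squeeze: global average pooling over the spatial dimensions
    ctx = x.mean(axis=(1, 2))                # (C_in,)
    # 2) attention over the K candidate kernels
    attn = softmax(w_attn @ ctx)             # (K,), sums to 1
    # 3) aggregate one input-specific kernel
    w = np.tensordot(attn, kernels, axes=1)  # (C_out, C_in, k, k)
    # 4) plain "valid" 2D convolution with the aggregated kernel
    H, W = x.shape[1:]
    out = np.zeros((C_out, H - k + 1, W - k + 1))
    for i in range(out.shape[1]):
        for j in range(out.shape[2]):
            patch = x[:, i:i + k, j:j + k]   # (C_in, k, k)
            out[:, i, j] = (w * patch).sum(axis=(1, 2, 3))
    return out, attn

# Illustrative usage with random weights (shapes are assumptions).
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 8, 8))           # 3-channel 8x8 feature map
kernels = rng.standard_normal((4, 6, 3, 3, 3))  # 4 candidates, 6 output channels
w_attn = rng.standard_normal((4, 3))
y, attn = dynamic_conv2d(x, kernels, w_attn)
```

Because the kernel is re-weighted per input, the effective filter adapts to the content of each feature map, which is the property the abstract credits with more complete feature extraction.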

Keywords: machine vision; LiDAR; external parameter calibration; deep learning

张赛赛、于红绯


School of Artificial Intelligence and Software, Liaoning Petrochemical University, Fushun 113000, Liaoning, China


Supported by the National Natural Science Foundation of China (61702247) and the Basic Scientific Research Project of Colleges and Universities of the Education Department of Liaoning Province (LJKMZ20220723)

2024

Laser & Optoelectronics Progress
Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 1.153
ISSN: 1006-4125
Year, Volume (Issue): 2024, 61(12)