External Parameter Calibration Method of LiDAR and Camera Based on Semantic Segmentation in Automatic Driving Environment
Perception is a key technology in autonomous driving, relying primarily on LiDAR and cameras. To perceive accurate environmental information, the extrinsic parameters of the sensors must be calibrated accurately. To address calibration failures caused by vehicle jolts and the inability to update the calibration in a timely manner, this study proposes a targetless extrinsic calibration method suited to urban streets. Buildings, vehicles, and road markings are selected as feature objects, and their point-cloud and image feature points are extracted. Starting from the initial extrinsic parameters, a random search algorithm matches the point cloud to the image, and the optimal extrinsic parameters are obtained from the best matching result. Using the KITTI dataset as an example, the feasibility and effectiveness of the method were verified through various experiments. The experimental results indicate that for rotational disturbances under 3°, the mean translation error remains below 0.095 m and the mean rotation error below 0.32°, indicating high accuracy. Compared with the line-feature-based CRLF method, the proposed method reduces the translation and rotation errors by 0.1 m and 0.55°, respectively, when only the rotation is perturbed. The method is therefore applicable to most urban street autonomous driving scenarios and exhibits good accuracy.
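The random-search refinement summarized above can be sketched as follows. This is a minimal illustrative implementation under assumed conditions, not the paper's code: it assumes a pinhole camera model, scores a candidate extrinsic by how many projected LiDAR feature points land on the image's semantic feature mask, and randomly perturbs the best extrinsic found so far. All function names, noise scales, and the scoring rule are illustrative assumptions.

```python
# Sketch of targetless extrinsic refinement by random search (assumptions:
# pinhole intrinsics K, semantic mask of feature pixels, Gaussian perturbations).
import numpy as np

def euler_to_rot(rx, ry, rz):
    """Rotation matrix from Euler angles (radians, ZYX composition)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(points, R, t, K):
    """Project Nx3 LiDAR points through extrinsic (R, t) and intrinsics K."""
    cam = points @ R.T + t          # LiDAR frame -> camera frame
    cam = cam[cam[:, 2] > 0.1]      # keep points in front of the camera
    uv = cam @ K.T
    return uv[:, :2] / uv[:, 2:3]   # pixel coordinates

def score(points, mask, R, t, K):
    """Count projected feature points that land on the semantic mask."""
    uv = np.round(project(points, R, t, K)).astype(int)
    h, w = mask.shape
    ok = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    uv = uv[ok]
    return int(mask[uv[:, 1], uv[:, 0]].sum())

def random_search(points, mask, R0, t0, K, iters=500,
                  rot_sigma=0.01, trans_sigma=0.02, seed=0):
    """Randomly perturb (R0, t0); keep the best-scoring extrinsic."""
    rng = np.random.default_rng(seed)
    best_R, best_t = R0, np.asarray(t0, dtype=float)
    best = score(points, mask, best_R, best_t, K)
    for _ in range(iters):
        dR = euler_to_rot(*rng.normal(0.0, rot_sigma, 3))
        R = dR @ best_R
        t = best_t + rng.normal(0.0, trans_sigma, 3)
        s = score(points, mask, R, t, K)
        if s > best:
            best, best_R, best_t = s, R, t
    return best_R, best_t, best
```

In practice the mask would come from a semantic segmentation network (buildings, vehicles, road markings), and the search would start from the initial extrinsic parameters; the sketch only conveys the match-and-search structure, not the paper's exact objective.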