A Relative Positioning Method for Rovers Based on Fusion of Camera-Station Point Clouds
Traditional feature-point-based relative positioning of a rover degrades, and can fail outright, as the distance between rover camera stations grows and the image perspective changes. To address this, this paper proposes a relative positioning method for rovers based on point cloud registration, combining the landing sequence images of the lander with the navigation images of the rover. First, a 3D point cloud of the landing area is reconstructed from the landing sequence images as the base map, and the initial position of the landing point is obtained. Second, a multi-scale cost-aggregation stereo matching method with additional residual modules is used to perform precise 3D reconstruction of the rover's camera-station area. Finally, the 3D point cloud of the landing area is registered with the camera-station point cloud using SAC-IA followed by ICP, and reliable relative positioning parameters of the rover are obtained through RANSAC. Relative positioning experiments were conducted on the Chang'e-4 rover; the results show that, compared with the traditional bundle-adjustment positioning method, the proposed method has average errors of 0.56 m and 0.52 m in the x and y directions and a maximum deviation of 0.11 m. This indicates that the proposed method can provide a valuable reference for intelligent perception and long-distance navigation and positioning of rovers.
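The registration stage pairs SAC-IA coarse alignment with ICP fine alignment. As an illustration only (not the authors' implementation, which operates on reconstructed lunar terrain clouds), the sketch below shows the core of the ICP refinement step in plain NumPy: alternating nearest-neighbour correspondence search with a closed-form rigid-transform estimate (Kabsch/SVD). All function names and parameters here are hypothetical choices for the sketch; a coarse initial alignment such as SAC-IA is assumed to have already brought the clouds roughly into place.

```python
import numpy as np

def best_fit_transform(A, B):
    """Closed-form rigid transform (R, t) minimizing ||(A @ R.T + t) - B||,
    given matched point sets A, B of shape (n, 3) (Kabsch/SVD solution)."""
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)            # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cb - R @ ca
    return R, t

def icp(src, dst, iters=50, tol=1e-9):
    """Point-to-point ICP aligning cloud src onto cloud dst.
    Returns the accumulated rotation R and translation t."""
    R_acc, t_acc = np.eye(3), np.zeros(3)
    cur, prev_err = src.copy(), np.inf
    for _ in range(iters):
        # Brute-force nearest neighbours (fine for small demo clouds;
        # a k-d tree would be used for real terrain point clouds).
        d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
        idx = d.argmin(axis=1)
        err = d[np.arange(len(cur)), idx].mean()
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_acc, t_acc = R @ R_acc, R @ t_acc + t
        if abs(prev_err - err) < tol:    # converged
            break
        prev_err = err
    return R_acc, t_acc
```

With a small initial misalignment (i.e., after a SAC-IA-style coarse step), the nearest-neighbour correspondences are correct from the first iteration and the transform is recovered essentially exactly for noiseless data.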