Research on an Augmented Reality Tracking and Localization Method Based on Multi-Feature Fusion
For augmented reality-assisted aircraft assembly in large-scale scenarios, the accuracy and robustness of mobile spatial localization on augmented reality devices directly affect the assembly guidance process. Against this background, a multi-feature fusion visual Simultaneous Localization and Mapping (SLAM) tracking and localization method is proposed that integrates point features, line features, and artificial markers. First, artificial markers are detected between adjacent frames while point and line features are extracted and matched in parallel; the markers are used to recover the initial camera pose and to construct an initial map that contains the markers. Then, the markers are tracked in real time to provide an initial estimate of the camera pose, which is refined by minimizing the reprojection error of the observed target points; even if camera tracking is lost, the marker information in the scene can assist camera relocalization. Finally, the marker constraints stored in the map are used to improve the accuracy of loop-closure correction and reduce the accumulated spatial localization error. Test results in real aircraft assembly scenes and a laboratory environment show that the proposed method effectively improves the accuracy and stability of SLAM localization, making virtual registration of the assembly model in mobile scenes more accurate and robust.
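As a concrete illustration of the marker-based initialization and reprojection-error refinement described above, the sketch below detects a fiducial (ArUco) marker, recovers an initial camera pose from its four corners, and evaluates the corner reprojection error. This is a minimal sketch, not the authors' implementation: it assumes OpenCV 4.7+ with the aruco module, and the camera intrinsics `K`, distortion coefficients `dist`, and marker side length `MARKER_LEN` are placeholder values.

```python
import cv2
import numpy as np

# Placeholder calibration and marker size (assumptions, not values from the paper).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)          # assume negligible lens distortion
MARKER_LEN = 0.10           # marker side length in metres

# 3-D corner coordinates of a square marker centred at its own origin,
# ordered top-left, top-right, bottom-right, bottom-left to match ArUco output.
half = MARKER_LEN / 2.0
marker_obj_pts = np.array([[-half,  half, 0.0],
                           [ half,  half, 0.0],
                           [ half, -half, 0.0],
                           [-half, -half, 0.0]], dtype=np.float32)

detector = cv2.aruco.ArucoDetector(
    cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50),
    cv2.aruco.DetectorParameters())

def initial_pose_from_marker(gray):
    """Recover an initial camera pose (rvec, tvec) from the first detected marker."""
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None or len(corners) == 0:
        return None
    img_pts = corners[0].reshape(4, 2).astype(np.float32)
    ok, rvec, tvec = cv2.solvePnP(marker_obj_pts, img_pts, K, dist,
                                  flags=cv2.SOLVEPNP_IPPE_SQUARE)
    if not ok:
        return None
    # Reprojection error of the marker corners: the same kind of residual a
    # SLAM back-end would minimise when refining the pose estimate.
    proj, _ = cv2.projectPoints(marker_obj_pts, rvec, tvec, K, dist)
    err = float(np.linalg.norm(proj.reshape(4, 2) - img_pts, axis=1).mean())
    return rvec, tvec, err
```

In a full system of the kind the abstract describes, such a marker-derived pose would serve as the initial value for point/line-based pose optimization and as an absolute reference during relocalization and loop-closure correction.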