Tightly coupled 3D indoor SLAM based on multi-sensor fusion
Simultaneous Localization and Mapping (SLAM) is widely used in the field of mobile robots because it solves localization and mapping in unknown environments. This paper proposes a SLAM method named 3D-MultiFus that fuses LiDAR, a camera, an IMU, and wheel odometry. The LiDAR-IMU-odometry subsystem (Ls) rapidly constructs the geometric structure of the global map by minimizing point-to-plane errors to estimate the system's pose state. Meanwhile, the camera-IMU-wheel-odometry subsystem (Vs) removes occluded or depth-discontinuous feature points and minimizes frame-to-map photometric errors, further refining the pose state and rendering the point cloud map in color within each sub-map. The IMU and wheel-odometry measurements, the LiDAR subsystem's point-to-plane errors, and the camera subsystem's photometric errors are tightly fused in an error-state iterated Kalman filter (ESIKF), ensuring precision and robustness while achieving rapid localization and mapping. To validate the localization and mapping accuracy of the proposed algorithm, 3D-MultiFus was compared with related algorithms in an established indoor motion experiment. Simulation and experimental results show that 3D-MultiFus completes data processing in 185 ms, outperforming the other algorithms in operational efficiency. The long-term positional error between the initial and final positions is only 0.0856 m in complex indoor scenes, markedly improving the accuracy of the global map. The constructed global map exhibits excellent consistency, confirming that the proposed algorithm is robust and reliable in indoor environments.
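For concreteness, a minimal sketch of the two residuals and the update named above; the notation (\(\mathbf{T}_k\), \(\mathbf{p}_i\), \(\mathbf{n}_j\), \(\mathbf{q}_j\), \(\pi(\cdot)\), \(I_k\), \(c(\cdot)\)) is assumed for illustration and is not taken from the paper, and the ESIKF increment is shown in a simplified form that omits the re-linearization terms of later iterations:
\[
\begin{aligned}
r^{L}_{i} &= \mathbf{n}_j^{\top}\bigl(\mathbf{T}_k\,\mathbf{p}_i - \mathbf{q}_j\bigr),\\
r^{V}_{s} &= I_k\bigl(\pi(\mathbf{T}_k^{-1}\mathbf{P}_s)\bigr) - c(\mathbf{P}_s),\\
\delta\mathbf{x} &= \bigl(\mathbf{H}^{\top}\mathbf{R}^{-1}\mathbf{H} + \mathbf{P}^{-1}\bigr)^{-1}\mathbf{H}^{\top}\mathbf{R}^{-1}\mathbf{r},
\end{aligned}
\]
where \(r^{L}_{i}\) is the point-to-plane residual of LiDAR point \(\mathbf{p}_i\) against the nearest map plane with normal \(\mathbf{n}_j\) through point \(\mathbf{q}_j\); \(r^{V}_{s}\) compares the intensity observed when map point \(\mathbf{P}_s\) is projected by \(\pi\) into image \(I_k\) with the color \(c(\mathbf{P}_s)\) stored in the map; and the error-state increment \(\delta\mathbf{x}\) stacks both residual sets in \(\mathbf{r}\) with Jacobian \(\mathbf{H}\), measurement noise \(\mathbf{R}\), and prior covariance \(\mathbf{P}\) propagated from the IMU and wheel odometry.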
Keywords: multi-sensor; tight coupling; 3D-SLAM; error-state iterated Kalman filter; efficiency and precision