
Simultaneous localization and dense scene reconstruction approach based on stereo VIO

Aiming at the pose drift and poor mapping quality of traditional dense simultaneous localization and mapping (SLAM) in highly dynamic scenes, a simultaneous localization and dense scene reconstruction approach based on stereo visual-inertial odometry (VIO) is proposed. First, to address pose drift, the bias information equation arising when 3D landmark points are extracted with a stereo camera is derived, and the observations of the inertial measurement unit (IMU) are used to screen landmark points, improving the positioning accuracy and robustness of the system. Second, to address poor mapping quality, an incremental map construction method is proposed that combines the CREStereo multi-view disparity estimation network with a truncated signed distance function (TSDF): erroneous depth estimates in the keyframe sequence are removed according to the effective range of the stereo model, and voxel hashing is used to dynamically build a globally consistent 3D scene. Finally, the real-time localization and 3D reconstruction performance of the algorithm is evaluated comparatively on the EuRoC dataset. Experimental results show that in highly dynamic scenes the positioning accuracy of the proposed algorithm is 9.20% higher than that of VINS-Mono, and that high-quality mapping is achieved under highly dynamic and weakly textured conditions.
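As a reading aid, the following is a minimal sketch (not the authors' code) of two steps summarized in the abstract: converting a stereo disparity map to depth while discarding estimates outside the stereo model's effective range, and fusing the surviving surface points into a hash-indexed TSDF volume so that only voxels near observed surfaces are allocated. Parameter names and values such as effective_range, voxel_size and trunc_dist are illustrative assumptions, not settings taken from the paper.

import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m, effective_range=(0.3, 8.0)):
    # depth = f * b / d; pixels with (near-)zero disparity are marked invalid (0)
    depth = np.where(disparity > 1e-6,
                     focal_px * baseline_m / np.maximum(disparity, 1e-6),
                     0.0)
    near, far = effective_range
    depth[(depth < near) | (depth > far)] = 0.0  # cull estimates outside the usable stereo range
    return depth

def integrate_points_tsdf(tsdf, weights, points_world, cam_center,
                          voxel_size=0.05, trunc_dist=0.15):
    # tsdf / weights are dicts keyed by integer voxel coordinates (the voxel-hashing idea:
    # only voxels inside the truncation band around observed surfaces are ever created).
    for p in points_world:
        ray = p - cam_center
        ray = ray / np.linalg.norm(ray)
        for s in np.arange(-trunc_dist, trunc_dist + 1e-9, voxel_size):
            q = p + s * ray                                   # sample along the viewing ray
            key = tuple(np.floor(q / voxel_size).astype(int)) # hash key of the voxel
            sdf = float(np.clip(-s, -trunc_dist, trunc_dist)) / trunc_dist
            w_old = weights.get(key, 0.0)
            tsdf[key] = (tsdf.get(key, 0.0) * w_old + sdf) / (w_old + 1.0)  # running weighted average
            weights[key] = w_old + 1.0
    return tsdf, weights

# Illustrative usage (synthetic numbers; focal length and baseline would come from calibration):
# depth = disparity_to_depth(disp, focal_px=435.2, baseline_m=0.11, effective_range=(0.3, 8.0))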

dense SLAM; visual-inertial odometry; depth estimation; 3D reconstruction

Zhang Xiaoguo (张小国), Shen Deyu (沈德玉), Zhang Zihan (张梓涵), Zheng Zihao (郑子豪)


Key Laboratory of Micro-Inertial Instrument and Advanced Navigation Technology, Ministry of Education, Nanjing 210096, China

School of Instrument Science and Engineering, Southeast University, Nanjing 210096, China

School of Software Engineering, Southeast University, Suzhou 215123, China


2024

Journal of Chinese Inertial Technology (中国惯性技术学报)
Chinese Society of Inertial Technology (中国惯性技术学会)

Indexed in: CSTPCD; Peking University Core Journal List (北大核心)
Impact factor: 0.792
ISSN: 1005-6734
Year, Volume (Issue): 2024, 32(11)