Autonomous relative optical navigation based on unified modeling of external systematic errors
To solve the problem that the external systematic errors of an optical camera cannot be fully estimated under limited computing resources, a unified dimensionality-reduction representation of the camera's external systematic errors is proposed and autonomous relative optical navigation is realized. The camera translational and misalignment errors are converted into a single three-dimensional rotation error, whose differential model can be established through specific attitude control and an appropriate assumption. The rotation error and the relative motion state are then jointly estimated in an augmented Kalman filter framework. Compared with the traditional method that estimates the camera translational and misalignment errors separately, the proposed method reduces computational complexity because the estimated state dimension is smaller. Furthermore, numerical simulation demonstrates that the estimation accuracy is significantly improved.
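To make the augmented-state idea concrete, the sketch below illustrates one possible filter structure consistent with the abstract: the relative motion state (position and velocity) is augmented with a single three-dimensional rotation-error parameter, giving nine states instead of the twelve needed when translational and misalignment errors are estimated separately. This is only a minimal illustration under assumed models (double-integrator relative dynamics, a random-walk rotation error, and a line-of-sight bearing measurement); the class and method names are hypothetical and not taken from the paper.

```python
# Minimal sketch of an augmented Kalman filter with state
# x = [relative position (3), relative velocity (3), rotation error (3)].
# Assumed models: double-integrator relative motion, random-walk rotation
# error, line-of-sight (unit-vector) measurement; names are illustrative.
import numpy as np

def skew(v):
    """Cross-product (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

class AugmentedKF:
    """Jointly estimates relative motion and a 3-D rotation error (9 states)."""

    def __init__(self, x0, P0, q_motion, q_rot, r_meas):
        self.x = np.asarray(x0, dtype=float)   # 9-dimensional state
        self.P = np.asarray(P0, dtype=float)   # 9x9 covariance
        self.q_motion = q_motion               # process noise for relative motion
        self.q_rot = q_rot                     # process noise for rotation error
        self.R = r_meas * np.eye(3)            # bearing-measurement noise

    def predict(self, dt):
        # Linear propagation: position integrates velocity; rotation error
        # is modeled as a random walk (identity transition).
        F = np.eye(9)
        F[0:3, 3:6] = dt * np.eye(3)
        Q = np.zeros((9, 9))
        Q[3:6, 3:6] = self.q_motion * dt * np.eye(3)
        Q[6:9, 6:9] = self.q_rot * dt * np.eye(3)
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z_los):
        """Update with a measured line-of-sight unit vector in the camera frame."""
        p = self.x[0:3]
        phi = self.x[6:9]
        n = np.linalg.norm(p)
        los = p / n
        # Small-angle model: measured LOS ~ (I - [phi]x) * true LOS
        h = (np.eye(3) - skew(phi)) @ los
        # Jacobians of h with respect to position and rotation error
        d_los_dp = (np.eye(3) - np.outer(los, los)) / n
        H = np.zeros((3, 9))
        H[:, 0:3] = (np.eye(3) - skew(phi)) @ d_los_dp
        H[:, 6:9] = skew(los)     # d/dphi of (-phi x los) = [los]x
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ (z_los - h)
        self.P = (np.eye(9) - K @ H) @ self.P
```

The point of the sketch is the dimensionality argument: a conventional formulation carrying a 3-D translational offset and a 3-D misalignment alongside the 6-D relative motion state would require twelve states, whereas folding both camera errors into one rotation error keeps the filter at nine, which is where the claimed reduction in computational complexity comes from.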