
Robot Hand-Eye Calibration in Different Spaces Based on LSTM Network

To address the mismatch between the robot's operating space and the camera's field of view, we propose a novel robot hand-eye calibration method for different (disjoint) spaces based on a long short-term memory (LSTM) network. First, the pixel coordinates of each circle center on a calibration board are extracted and recorded in sequence. The board is then carried by a conveyor belt into the robot workspace, where the pose of the probe tip on the robot end-effector is recorded. Next, an LSTM network is trained on these data to learn the hand-eye mapping. Finally, 36 sets of real data are used as a validation set to assess prediction accuracy. The results show that the trained model predicts coordinates in the robot base frame with an average translation error of only 0.69 mm, and that for validation data randomly distributed over the conveyor belt's entire workspace the translation-error fluctuation remains below 1 mm, confirming the robustness and effectiveness of the method. Compared with classical planar calibration methods, the proposed method offers a larger effective workspace and higher calibration accuracy, and it effectively compensates for errors caused by factors such as camera lens distortion and depth variation.
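The mapping described in the abstract (pixel coordinates of circle centers in, robot base-frame coordinates out) can be sketched with a single LSTM cell. Everything below is an illustrative assumption, not the authors' implementation: the layer sizes, the random placeholder weights, and the `LSTMCellSketch` name are all hypothetical, and in the paper the weights would be learned from the recorded calibration data.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class LSTMCellSketch:
    """One LSTM cell mapping a sequence of pixel coordinates (u, v)
    to a predicted robot base-frame position (x, y, z).
    Hypothetical sketch: weights are random placeholders, not trained."""

    def __init__(self, input_size=2, hidden_size=16, output_size=3):
        H, D = hidden_size, input_size
        # Stacked gate weights for the input, forget, cell, and output gates.
        self.W = rng.normal(0, 0.1, (4 * H, D + H))
        self.b = np.zeros(4 * H)
        self.Wo = rng.normal(0, 0.1, (output_size, H))  # readout to (x, y, z)
        self.H = H

    def forward(self, seq):
        """seq: array of shape (T, 2) -- circle-center pixel coordinates."""
        h = np.zeros(self.H)
        c = np.zeros(self.H)
        for x_t in seq:
            z = self.W @ np.concatenate([x_t, h]) + self.b
            i, f, g, o = np.split(z, 4)
            i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)
            c = f * c + i * np.tanh(g)   # update cell state
            h = o * np.tanh(c)           # update hidden state
        return self.Wo @ h               # predicted base-frame (x, y, z)

cell = LSTMCellSketch()
pixels = rng.uniform(0, 640, (5, 2))     # 5 circle centers (u, v), made up
pred = cell.forward(pixels)
```

In a real setup the weights would be fit by regressing the recorded end-effector poses against the corresponding pixel sequences, which is what lets the network absorb lens distortion and depth effects that a planar homography cannot.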

Keywords: robot; hand-eye calibration; LSTM network; different spaces

乐恒韬, 赵康康, 吴松林, 付中涛, 陈绪兵


School of Mechanical and Electrical Engineering, Wuhan Institute of Technology, Wuhan 430205, Hubei, China

Hubei Provincial Research Center of Intelligent Welding Equipment and Software Engineering Technology (Wuhan Institute of Technology), Wuhan 430205, Hubei, China


2024

Journal of Wuhan Institute of Technology
Wuhan Institute of Technology

Impact factor: 0.463
ISSN:1674-2869
Year, volume (issue): 2024, 46(5)