
Cognitive approach for human-robot collaborative assembly scene based on multi-scale object detection

The rapid understanding of Human-Robot Collaborative (HRC) assembly scenes is of great practical significance for improving the cognitive ability of collaborative robots and realizing HRC assembly. Aiming at the problems of large differences in object scale and the lack of a unified scene description framework in the cognition of unstructured HRC assembly scenes, a Lightweight Multi-Scale object detection Network (LMS-Net) was first constructed, and an anchor clustering mechanism was introduced during network training to improve the accuracy of multi-scale object detection. Then, the LMS-Net detection results were converted into a human-object interaction graph, a meta-description model of the HRC assembly scene was established, and a cognitive method for HRC assembly scenes based on multi-scale object detection was proposed. Experimental results on the self-built HRC-Action dataset show that the proposed multi-scale object detection network achieves high accuracy (89% on average) and fast speed (58.7 FPS on average on a deep learning workstation and 25 FPS on average on a Jetson Nano B01 embedded board), and that the proposed HRC assembly scene cognition method is feasible and practical.
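
The anchor clustering mechanism is only named in the abstract; a common way to realize it, popularized by YOLO, is k-means clustering of the ground-truth box sizes under a 1 − IoU distance so that each detection scale receives anchors matched to the object sizes it must cover. The sketch below illustrates that generic approach only; the number of clusters and the example box sizes are assumptions, not values taken from the paper.

```python
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between boxes and anchors compared by width/height only,
    i.e. as if both were centered at the origin.
    boxes: (N, 2), anchors: (K, 2), both as (width, height)."""
    inter_w = np.minimum(boxes[:, None, 0], anchors[None, :, 0])
    inter_h = np.minimum(boxes[:, None, 1], anchors[None, :, 1])
    inter = inter_w * inter_h
    union = (boxes[:, 0] * boxes[:, 1])[:, None] + \
            (anchors[:, 0] * anchors[:, 1])[None, :] - inter
    return inter / union

def kmeans_anchors(boxes, k=9, iters=100, seed=0):
    """Cluster ground-truth (w, h) pairs into k anchor boxes using
    1 - IoU as the distance metric (YOLO-style anchor clustering)."""
    rng = np.random.default_rng(seed)
    anchors = boxes[rng.choice(len(boxes), k, replace=False)]
    for _ in range(iters):
        # Assign each box to the anchor it overlaps most with.
        assign = np.argmax(iou_wh(boxes, anchors), axis=1)
        new_anchors = np.array([
            boxes[assign == i].mean(axis=0) if np.any(assign == i) else anchors[i]
            for i in range(k)
        ])
        if np.allclose(new_anchors, anchors):
            break
        anchors = new_anchors
    # Sort by area so small anchors can be assigned to high-resolution heads.
    return anchors[np.argsort(anchors[:, 0] * anchors[:, 1])]

if __name__ == "__main__":
    # Hypothetical (width, height) pairs of ground-truth boxes, in pixels.
    gt_wh = np.array([[30, 40], [28, 35], [120, 90], [320, 240],
                      [25, 30], [110, 100], [300, 260], [35, 45], [130, 85]])
    print(kmeans_anchors(gt_wh, k=3))
```

The resulting anchors span small, medium and large object sizes, which is what matters when tools, parts and human operators appear at very different scales in the same assembly scene.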

Keywords: human-robot collaborative assembly; scene cognition; object detection; human-object interaction graph
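
The abstract also states that LMS-Net detections are converted into a human-object interaction graph that underlies the scene meta-description model. The paper's graph definition is not given on this record page, so the sketch below shows only one plausible conversion from detection boxes to such a graph; the class names, the pixel proximity threshold, and the "near"/"far" relation labels are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Detection:
    label: str          # e.g. "hand", "wrench", "gearbox" (hypothetical classes)
    box: tuple          # (x1, y1, x2, y2) in pixels
    score: float

@dataclass
class HOIGraph:
    nodes: list = field(default_factory=list)   # Detection instances
    edges: list = field(default_factory=list)   # (human_idx, object_idx, relation)

def center(box):
    x1, y1, x2, y2 = box
    return ((x1 + x2) / 2, (y1 + y2) / 2)

def build_hoi_graph(detections, human_labels=("person", "hand"), near_px=80):
    """Link every human-class detection to every object detection.
    The distance-based 'near'/'far' relation is only a placeholder for
    the interaction reasoning used in the paper."""
    graph = HOIGraph(nodes=list(detections))
    for i, h in enumerate(detections):
        if h.label not in human_labels:
            continue
        for j, o in enumerate(detections):
            if j == i or o.label in human_labels:
                continue
            hx, hy = center(h.box)
            ox, oy = center(o.box)
            dist = ((hx - ox) ** 2 + (hy - oy) ** 2) ** 0.5
            relation = "near" if dist < near_px else "far"
            graph.edges.append((i, j, relation))
    return graph

if __name__ == "__main__":
    dets = [Detection("hand", (100, 100, 160, 160), 0.95),
            Detection("wrench", (150, 110, 210, 150), 0.90),
            Detection("gearbox", (400, 300, 560, 420), 0.88)]
    g = build_hoi_graph(dets)
    for i, j, rel in g.edges:
        print(f"{g.nodes[i].label} -[{rel}]-> {g.nodes[j].label}")
```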

董元发、严华兵、刘勇哲、彭巍、周彬、方子帆


Hubei Key Laboratory of Hydroelectric Machinery Design & Maintenance, China Three Gorges University, Yichang 443002, Hubei, China

College of Mechanical and Power Engineering, China Three Gorges University, Yichang 443002, Hubei, China

Intelligent Manufacturing Innovation Technology Center, China Three Gorges University, Yichang 443002, Hubei, China


Funding: National Natural Science Foundation of China (52075292); Natural Science Foundation of Hubei Province (2023AFB1116); Open Fund of the Hubei Key Laboratory of Hydroelectric Machinery Design & Maintenance, China Three Gorges University (2020KJX05)


Journal: Computer Integrated Manufacturing Systems (计算机集成制造系统)
Publisher: The 210th Research Institute of China North Industries Group Corporation


Indexed in: CSTPCD; Peking University Chinese Core Journals (北大核心)
Impact factor: 1.092
ISSN: 1006-5911
Year, Volume (Issue): 2024, 30(5)