
Modeling Static and Dynamic Joint Relationships for 3D Human Pose Estimation

To address the difficulty that existing three-dimensional (3D) pose estimation methods have in accurately predicting human keypoint positions in complex scenes, a new 3D pose estimation method that jointly models static and dynamic joint relationships is proposed, aiming to cope with challenges such as joint occlusion and singular poses and to improve model accuracy in complex scenes. A human joint map is first obtained by computing mutual information between joints and is used to guide joint grouping, with the groups clustered according to the three-level degrees of freedom of the human structure. Static joint relationships are then modeled with a cascaded estimation and joint-group feature-sharing network, and a multi-group attention mechanism is designed to model dynamic joint relationships. In addition, a category-balanced pose recombination strategy is introduced to expand the training data and strengthen the model's generalization ability. Experimental results show that the proposed model performs well on the Human3.6M, MPI-INF-3DHP and MPII datasets: compared with existing models, the mean keypoint position error is reduced by at least 0.2 mm and the accuracy is improved by at least 0.2%, effectively improving the overall performance of 3D pose estimation, especially in complex scenes.
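The abstract does not give implementation details. As an illustration only, the sketch below shows one way a joint affinity map could be estimated from mutual information between joint motion trajectories and then used to cluster joints into groups; all names here (joint_mutual_info_map, group_joints, the torso/arm/leg seed groups) are hypothetical and not taken from the paper.

import numpy as np

def joint_mutual_info_map(poses, bins=16):
    """Estimate pairwise mutual information between joint motion signals.

    poses: array of shape (T, J, 3) -- T frames, J joints, 3D coordinates.
    Returns a symmetric (J, J) affinity matrix.
    """
    # Per-frame displacement magnitude of each joint as a 1D motion signal.
    motion = np.linalg.norm(np.diff(poses, axis=0), axis=2)   # (T-1, J)
    num_joints = motion.shape[1]
    mi = np.zeros((num_joints, num_joints))
    for i in range(num_joints):
        for j in range(i + 1, num_joints):
            # Histogram-based estimate of the joint distribution p(x, y).
            pxy, _, _ = np.histogram2d(motion[:, i], motion[:, j], bins=bins)
            pxy /= pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)                # marginal p(x)
            py = pxy.sum(axis=0, keepdims=True)                # marginal p(y)
            nz = pxy > 0
            mi[i, j] = mi[j, i] = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
    return mi

def group_joints(mi, seed_groups):
    """Attach each unassigned joint to the seed group it shares the most
    mutual information with.

    seed_groups: dict mapping group name -> list of seed joint indices,
    e.g. {"torso": [...], "arms": [...], "legs": [...]} from the skeleton.
    """
    groups = {name: list(idx) for name, idx in seed_groups.items()}
    assigned = {i for idx in seed_groups.values() for i in idx}
    for j in range(mi.shape[0]):
        if j in assigned:
            continue
        best = max(groups, key=lambda g: mi[j, groups[g]].mean())
        groups[best].append(j)
    return groups

Seeding the groups from the skeleton's torso, arm and leg chains loosely mirrors the three-level degree-of-freedom clustering mentioned in the abstract; the authors' actual grouping criterion and network architecture may differ.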

joint occlusion; singular pose; semantic feature; joint relationship; three-dimensional pose estimation

LIU Qiong, HE Jianhang, WEN Jiaxiao


School of Software Engineering, South China University of Technology, Guangzhou 510006, China


2024

Journal of Beijing University of Posts and Telecommunications
Beijing University of Posts and Telecommunications

Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.592
ISSN:1007-5321
Year, Volume (Issue): 2024, 47(5)