
Gait recognition with united local multiscale and global context features

Existing gait recognition methods can extract rich gait information in the spatial dimension, but they often overlook the fine-grained temporal features within local regions and the temporal context information across different sub-regions. Considering that gait recognition is a fine-grained recognition problem and that each individual's walking pattern carries unique temporal context information, we propose a gait recognition method that combines local multiscale and global contextual temporal features. The entire gait sequence is divided at multiple temporal resolutions, and multi-resolution fine-grained temporal features are extracted within each local sub-sequence. A Transformer is then used to extract temporal context information among the sub-sequences, and a global feature is formed by fusing all sub-sequences according to this contextual information. Extensive experiments on two public datasets show that the proposed model achieves rank-1 accuracies of 98.0%, 95.4%, and 87.0% under the three walking conditions of the CASIA-B dataset, and a rank-1 accuracy of 90.7% on the OU-MVLP dataset. The proposed method achieves state-of-the-art results and can serve as a reference for other gait recognition methods.
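
To make the temporal pipeline described in the abstract concrete, the following minimal PyTorch sketch illustrates the general idea only: frame-level features (assumed to come from a spatial CNN backbone) are split into contiguous sub-sequences, multi-scale temporal convolutions extract fine-grained local features inside each sub-sequence, and a Transformer encoder models the temporal context across the sub-sequences before fusion into a global feature. The module name LocalGlobalTemporal, the channel size, the number of sub-sequences, the convolution scales, and the max-pooling fusion are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class LocalGlobalTemporal(nn.Module):
    """Sketch: local multi-scale temporal features + Transformer-based temporal context."""
    def __init__(self, channels=128, num_parts=4, scales=(1, 3, 5), heads=4):
        super().__init__()
        self.num_parts = num_parts
        # One temporal 1-D convolution per scale (multi-resolution local features).
        self.local_convs = nn.ModuleList(
            [nn.Conv1d(channels, channels, kernel_size=k, padding=k // 2) for k in scales]
        )
        # Transformer encoder models temporal context among the sub-sequence tokens.
        layer = nn.TransformerEncoderLayer(d_model=channels, nhead=heads,
                                           dim_feedforward=2 * channels, batch_first=True)
        self.context = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x):
        # x: (batch, time, channels) frame-level features from a spatial backbone.
        b, t, c = x.shape
        t = (t // self.num_parts) * self.num_parts            # drop remainder frames
        parts = x[:, :t].reshape(b * self.num_parts, -1, c)   # contiguous sub-sequences
        parts = parts.transpose(1, 2)                         # (B*P, C, T/P) for Conv1d
        # Multi-scale local features, max-pooled over time inside each sub-sequence.
        local = sum(conv(parts).amax(dim=-1) for conv in self.local_convs)   # (B*P, C)
        tokens = local.reshape(b, self.num_parts, c)          # one token per sub-sequence
        tokens = self.context(tokens)                         # context across sub-sequences
        return tokens.amax(dim=1)                             # fused global feature (B, C)

if __name__ == "__main__":
    feats = torch.randn(2, 30, 128)              # 2 sequences, 30 frames, 128-D features
    print(LocalGlobalTemporal()(feats).shape)    # torch.Size([2, 128])

In this sketch each sub-sequence is collapsed to a single token before the Transformer; a finer-grained variant could keep frame-level tokens per sub-sequence, but the coarse version already reflects the local-then-global structure described above.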

Keywords: biometric identification; gait recognition; cross-view; convolutional neural networks; deep learning; residual connection; fine-grained; attention mechanism

李浩淼、张含笑、邢向磊


College of Intelligent Systems Science and Engineering, Harbin Engineering University, Harbin 150001, Heilongjiang, China


Funding: National Natural Science Foundation of China (62076078, 61703119)

2024

CAAI Transactions on Intelligent Systems (智能系统学报)
Sponsored by the Chinese Association for Artificial Intelligence and Harbin Engineering University

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.672
ISSN: 1673-4785
Year, Volume (Issue): 2024, 19(4)