

Person Re-identification Method Based on Multi-scale Local Feature Fusion
To address the problems of feature misalignment, neglected semantic correlation between adjacent regions, background clutter, and low training efficiency when extracting pedestrian features in existing person re-identification methods, a multi-scale local feature fusion method is proposed. First, a spatial transformer network is introduced to apply an adaptive affine transformation to the image, aligning pedestrian spatial features. Second, feature maps at different scales are split into equal horizontal stripes, and adjacent local blocks are spliced in different ways to compensate for the correlation information between adjacent blocks that is lost by the cutting. Global and local features are then fused to mine the correlation between them. Meanwhile, Random Erasing is applied to the datasets to prevent the model from overfitting, and multiple loss functions are used to train the network, improving intra-class compactness and inter-class separability. Finally, experiments are carried out on the Market-1501 and DukeMTMC-ReID datasets: Rank-1 reaches 95.0% and 88.8%, and mAP reaches 89.2% and 78.9%, respectively. The results show that the proposed method extracts more discriminative pedestrian features.
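The horizontal partitioning and adjacent-block fusion described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the backbone feature-map size (256×24×8), the stripe count (6), and the use of average pooling and plain concatenation are all assumptions for the sketch.

```python
import numpy as np

def horizontal_parts(feat, num_parts):
    """Split a C x H x W feature map into equal horizontal stripes and
    average-pool each stripe into a C-dim local descriptor."""
    stripes = np.array_split(feat, num_parts, axis=1)   # cut along the height axis
    return [s.mean(axis=(1, 2)) for s in stripes]

def fuse_adjacent(parts):
    """Concatenate each pair of adjacent stripe descriptors, partially
    recovering the correlation lost by the hard horizontal cut."""
    return [np.concatenate([a, b]) for a, b in zip(parts, parts[1:])]

# Stand-in for a backbone feature map (channels, height, width)
feat = np.random.default_rng(0).random((256, 24, 8))

local_parts = horizontal_parts(feat, 6)   # six stripe descriptors, 256-dim each
fused_parts = fuse_adjacent(local_parts)  # five adjacent-pair descriptors, 512-dim each
global_desc = feat.mean(axis=(1, 2))      # global descriptor, 256-dim
```

In the full method, descriptors of this kind would additionally be fused with the global feature and trained jointly with multiple loss functions.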

Person re-identificationLocal featureFeature space segmentationSpatial transformation networkRandom erasure

吴蕾、王海瑞、朱贵富、赵江河


Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China


Funding: National Natural Science Foundation of China (61863016, 61263023)

Computer Science (计算机科学)
Publisher: Chongqing Southwest Information Co., Ltd. (formerly the Southwest Information Center of the Ministry of Science and Technology)

Indexed in: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.944
ISSN: 1002-137X
Year, Volume (Issue): 2024, 51(Z1)