Person Re-identification Method Based on Multi-scale Local Feature Fusion
Existing person re-identification methods suffer from feature misalignment, neglect of the semantic correlation between adjacent regions, background clutter, and low training efficiency when extracting pedestrian features; to address these problems, a multi-scale local feature fusion method is proposed. Firstly, a spatial transformer network is introduced to apply an adaptive affine transformation to the image, aligning pedestrian spatial features. Secondly, feature maps at different scales are segmented horizontally, and adjacent local blocks are spliced in different ways to compensate for the correlation information between adjacent blocks that is lost by cutting. Then, the correlation between global features and local features is mined. At the same time, random erasing is applied to the dataset to prevent the model from overfitting, and multiple loss functions are used to train the network, improving the intra-class compactness and inter-class diversity of the model. Finally, experiments on the Market-1501 and DukeMTMC-ReID datasets achieve Rank-1 accuracies of 95.0% and 88.8% and mAP of 89.2% and 78.9%, respectively. The results show that the proposed method extracts more discriminative pedestrian features.
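The horizontal segmentation and adjacent-block splicing described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the number of parts, and the use of global average pooling on each block are assumptions made for the example.

```python
import numpy as np

def split_and_splice(feat, parts=4):
    """Split a CxHxW feature map into horizontal stripes, then also splice
    each pair of adjacent stripes so that context lost at the cut lines is
    retained.  Every block is average-pooled into a descriptor vector."""
    c, h, w = feat.shape
    step = h // parts
    # plain horizontal stripes (local blocks)
    stripes = [feat[:, i * step:(i + 1) * step, :] for i in range(parts)]
    # splice each stripe with its lower neighbour to recover cross-cut context
    spliced = [np.concatenate([stripes[i], stripes[i + 1]], axis=1)
               for i in range(parts - 1)]
    # global average pooling over the spatial dimensions of every block
    pool = lambda x: x.mean(axis=(1, 2))
    local_descs = [pool(s) for s in stripes]
    spliced_descs = [pool(s) for s in spliced]
    return local_descs, spliced_descs
```

Because each spliced block covers two adjacent stripes, its pooled descriptor mixes information from both sides of a cut, which is the correlation that plain stripe partitioning discards.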
Person re-identification; Local feature; Feature space segmentation; Spatial transformation network; Random erasure
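The random-erasing augmentation used to regularize training can be sketched as below. This is a generic version of the technique, not the paper's exact configuration: the occlusion probability, area range, and aspect-ratio range are common defaults and should be treated as assumptions.

```python
import numpy as np

def random_erasing(img, p=0.5, area_ratio=(0.02, 0.4),
                   aspect=(0.3, 3.3), rng=None):
    """Occlude a random rectangle of an HxWxC image with uniform noise.

    With probability (1 - p) the image is returned unchanged.  Otherwise a
    box whose area and aspect ratio are drawn from the given ranges is
    filled with random values, simulating occlusion during training."""
    if rng is None:
        rng = np.random.default_rng()
    if rng.random() > p:
        return img
    h, w = img.shape[:2]
    for _ in range(100):  # retry until a box fits inside the image
        area = rng.uniform(*area_ratio) * h * w
        r = rng.uniform(*aspect)
        eh = int(round(np.sqrt(area * r)))
        ew = int(round(np.sqrt(area / r)))
        if 0 < eh < h and 0 < ew < w:
            y = rng.integers(0, h - eh)
            x = rng.integers(0, w - ew)
            out = img.copy()
            out[y:y + eh, x:x + ew] = rng.uniform(
                0, 255, (eh, ew) + img.shape[2:])
            return out
    return img
```

Erasing a random region forces the network to rely on the remaining body parts, which is why the abstract pairs it with the local-feature branches as an overfitting countermeasure.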