Weak–Strong Synergy Learning With Random Grayscale Substitution for Cross-Modality Person Re-Identification

Visible-infrared person re-identification (VI-ReID) is a rapidly emerging cross-modality matching problem that aims to identify the same individual across the daytime visible modality and the nighttime thermal modality. Existing state-of-the-art methods predominantly focus on leveraging image generation techniques to create cross-modality images or on designing diverse feature-level constraints to align feature distributions between heterogeneous data. However, challenges arising from color variations caused by differences in the imaging processes of spectrum cameras remain unresolved, leading to suboptimal feature representations. In this paper, we propose a simple yet highly effective data augmentation technique called Random Grayscale Region Substitution (RGRS) for the cross-modality matching task. RGRS operates by randomly selecting a rectangular region within a training sample and converting it to grayscale. This process generates training images that integrate varying levels of visible and channel-independent information, thereby mitigating overfitting and enhancing the model's robustness to color variations. In addition, we design a weighted regularized triplet loss function for cross-modality metric learning and a weak–strong synergy learning strategy to improve the performance of cross-modal matching. We validate the effectiveness of our approach through extensive experiments conducted on publicly available cross-modality Re-ID datasets, including SYSU-MM01 and RegDB. The experimental results demonstrate that our proposed method significantly improves accuracy, making it a valuable training trick for advancing VI-ReID research.
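The abstract describes RGRS only at a high level (select a random rectangular region of a training image and convert it to grayscale). The sketch below illustrates one plausible way to implement such an augmentation in PyTorch; the probability, area ratio, and aspect-ratio ranges are illustrative assumptions rather than values taken from the paper.

```python
import random
import torch

class RandomGrayscaleRegionSubstitution:
    """Hedged sketch of an RGRS-style augmentation: replace a randomly
    chosen rectangular region of an RGB image with its grayscale version.
    All hyperparameter defaults below are assumptions for illustration."""

    def __init__(self, probability=0.5, area_range=(0.02, 0.4), aspect_range=(0.3, 3.33)):
        self.probability = probability    # chance of applying the augmentation
        self.area_range = area_range      # region area as a fraction of the image
        self.aspect_range = aspect_range  # region height/width aspect ratio

    def __call__(self, img: torch.Tensor) -> torch.Tensor:
        # img: float tensor of shape (3, H, W) in RGB order
        if random.random() > self.probability:
            return img
        _, h, w = img.shape
        for _ in range(100):  # retry until a sampled region fits inside the image
            target_area = random.uniform(*self.area_range) * h * w
            aspect = random.uniform(*self.aspect_range)
            rh = int(round((target_area * aspect) ** 0.5))
            rw = int(round((target_area / aspect) ** 0.5))
            if 0 < rh < h and 0 < rw < w:
                y = random.randint(0, h - rh)
                x = random.randint(0, w - rw)
                patch = img[:, y:y + rh, x:x + rw]
                # ITU-R BT.601 luma weights for RGB -> grayscale conversion
                gray = 0.299 * patch[0] + 0.587 * patch[1] + 0.114 * patch[2]
                img[:, y:y + rh, x:x + rw] = gray.unsqueeze(0).expand(3, -1, -1)
                return img
        return img
```

In a typical training pipeline, such a transform would be composed with the usual resizing, flipping, and normalization steps applied to the visible-modality images, so that each mini-batch mixes color and channel-independent appearance cues.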

adaptive grayscale effect; data augmentation; deep learning; visible-infrared person re-identification

Zexin Zhang


School of Computer Science, Wuhan University, Wuhan, China

2025

Concurrency and Computation: Practice and Experience

ISSN:1532-0634
Year, Volume (Issue): 2025, 37(12/14)