
Multi-source unsupervised domain adaptation for object detection

Domain adaptation for object detection has been extensively studied in recent years. Most existing approaches focus on single-source unsupervised domain adaptive object detection. However, a more practical scenario is that the labeled source data are collected from multiple domains with different feature distributions. Conventional approaches do not work well in this setting because multiple domain gaps exist. We propose a Multi-source domain Knowledge Transfer (MKT) method to handle this situation. First, the low-level features from multiple domains are aligned by learning a shallow feature extraction network. Then, the high-level features from each pair of source and target domains are aligned by the subsequent multi-branch network. After that, we perform two kinds of information fusion: (1) we train a detection network shared by all branches based on the transferability of each source sample feature, where transferability measures how indistinguishable a source sample feature is from the target domain sample features; (2) at inference time, the target sample features output by the multi-branch network are fused according to the average transferability of each source domain. Moreover, we leverage both image-level and instance-level attention to promote positive cross-domain transfer and suppress negative transfer. Our main contributions are the two-stage feature alignment and the information fusion scheme. Extensive experimental results on various transfer scenarios show that our method achieves state-of-the-art performance.
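The transferability-weighted fusion at inference can be illustrated with a minimal sketch. This is hypothetical PyTorch code, not the authors' implementation: the branch modules, the per-branch domain discriminators, and the exact way transferability scores are turned into fusion weights are all assumptions made here for illustration. The idea shown is that each source-specific branch produces high-level features for a target sample, a per-branch discriminator scores how hard those features are to separate from that branch's source domain, and the per-domain average score weights the fused output.

```python
# Hypothetical sketch of transferability-weighted fusion (assumed design, not the paper's code).
# Assumes one branch per source domain and a per-branch binary domain discriminator whose
# output near 0.5 means source and target features are hard to tell apart (high transferability).
import torch
import torch.nn as nn


def transferability(discriminator: nn.Module, feats: torch.Tensor) -> torch.Tensor:
    """Per-sample score in [0, 1]; higher when the discriminator cannot separate domains."""
    p = torch.sigmoid(discriminator(feats))      # P(sample comes from the source domain)
    return 1.0 - 2.0 * (p - 0.5).abs()           # 1 when p == 0.5, 0 when p is 0 or 1


def fuse_target_features(branches, discriminators, shared_feat):
    """Fuse per-branch target features, weighted by each domain's average transferability."""
    branch_feats, weights = [], []
    for branch, disc in zip(branches, discriminators):
        f = branch(shared_feat)                              # high-level features for this source-target pair
        branch_feats.append(f)
        weights.append(transferability(disc, f).mean())      # average transferability for this domain
    w = torch.softmax(torch.stack(weights), dim=0)           # normalize weights across source domains
    return sum(wi * fi for wi, fi in zip(w, branch_feats))   # weighted fusion of branch outputs
```

In this sketch the shared shallow network output `shared_feat` is fed to every branch, so branches whose source domain is closer to the target (higher average transferability) contribute more to the fused target representation.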

Multi-source object detection; Unsupervised domain adaptation; Transferability; Feature fusion

Zhang, Dan; Ye, Mao; Xiong, Lin; Zhou, Lihua; Liu, Yiguang


Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China

Sichuan Univ, Sch Comp Sci, Vis & Image Proc Lab, Chengdu 610065, Peoples R China

2022

Information Fusion


EI SCI
ISSN:1566-2535
Year, Volume (Issue): 2022, 78