Neural Networks, 2022, Vol. 145. DOI: 10.1016/j.neunet.2021.10.013

Cross-attention-map-based regularization for adversarial domain adaptation

Li J.¹, Wang H.¹, Wu K.¹, Liu C.¹, Tan J.¹

Author information

  • 1. Institute of Automation, Chinese Academy of Sciences

Abstract

© 2021 Elsevier Ltd. In unsupervised domain adaptation (UDA), many efforts have been made to pull the source and target domains closer through adversarial training. Most methods focus on aligning distributions or features between the source domain and the target domain. However, little attention is paid to interaction at finer-grained levels, such as between classes or samples of the two domains. In contrast to UDA, another transfer learning task, few-shot learning (FSL), takes full advantage of such finer-grained alignment. Many FSL methods exploit the interaction between samples of the support and query sets, leading to significant improvements. This raises the question of whether these ideas from FSL can be brought to UDA. To this end, we first take a closer look at the differences between FSL and UDA and bridge the gap between them by high-confidence sample selection (HCSS). We then propose a cross-attention map generation module (CAMGM) to enable interaction between the samples selected by HCSS. Moreover, we propose a simple but efficient method called cross-attention-map-based regularization (CAMR) to regularize the feature maps generated by the feature extractor. Experiments on three challenging datasets demonstrate that CAMR brings solid improvements when added to the original objective. More specifically, the proposed CAMR outperforms the original methods by 1% to 2% on most tasks without bells and whistles.
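The abstract gives only a high-level description of HCSS, CAMGM, and CAMR. As a rough, hypothetical sketch of what cross-attention-map-based regularization between selected source and target samples could look like in PyTorch (all function names, tensor shapes, the confidence threshold, and the loss form are assumptions for illustration, not the authors' implementation):

```python
import torch
import torch.nn.functional as F


def high_confidence_select(logits, threshold=0.95):
    """Stand-in for HCSS: keep target samples whose softmax confidence
    exceeds a threshold (the threshold value is an assumption)."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo_labels = probs.max(dim=1)
    mask = conf > threshold
    return mask, pseudo_labels


def cross_attention_map(feat_a, feat_b):
    """Cross-attention map between two feature maps of shape (B, C, H, W):
    every spatial location of feat_a attends over the locations of feat_b."""
    b, c, h, w = feat_a.shape
    q = feat_a.flatten(2).transpose(1, 2)            # (B, HW, C)
    k = feat_b.flatten(2)                            # (B, C, HW)
    return torch.softmax(q @ k / c ** 0.5, dim=-1)   # (B, HW, HW)


def camr_loss(src_feat, tgt_feat):
    """Toy regularizer: make the source-to-target attention map consistent
    with the transpose of the target-to-source map (one plausible reading of
    "regularize the feature maps"; not the paper's exact objective)."""
    a_st = cross_attention_map(src_feat, tgt_feat)
    a_ts = cross_attention_map(tgt_feat, src_feat)
    return F.mse_loss(a_st, a_ts.transpose(1, 2))
```

In a full UDA pipeline, such a term would be weighted and added to the existing adversarial objective, consistent with the abstract's statement that CAMR is added to the original objective rather than replacing it.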

Keywords

Attention mechanism; Contrastive learning; Domain adaptation; Few-shot learning


Publication year: 2022

Neural Networks
Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 8
References: 60