NT-FAN: A simple yet effective noise-tolerant few-shot adaptation network

Few-shot domain adaptation (FDA) aims to train a target model with abundant labeled data from the source domain and only a few labeled examples from the target domain. Under a limited annotation budget, the source data may contain many noisy labels, which can detrimentally impact model performance in real-world applications. This problem setting is denoted as wildly few-shot domain adaptation (WFDA), which simultaneously handles label noise and data shortage. While previous studies have achieved some success, they typically rely on multiple adaptation models to collaboratively filter noisy labels, incurring substantial computational overhead. To address WFDA more simply and elegantly, we offer a theoretical analysis of this problem and derive a comprehensive upper bound on the excess risk in the target domain. Our theoretical result reveals that correct domain-invariant representations can be obtained even in the presence of source noise and limited target data, without incurring additional costs. Accordingly, we propose a simple yet effective WFDA method, referred to as the noise-tolerant few-shot adaptation network (NT-FAN). Experiments demonstrate that our method significantly outperforms all state-of-the-art competitors while maintaining a more lightweight architecture. Notably, NT-FAN consistently exhibits robust performance under more realistic and intractable source noise (e.g., instance-dependent label noise) as well as severe source noise (e.g., a 40% noise rate).

Representation learning; Weakly-supervised learning; Few-shot learning; Transfer learning

Wenjing Yang, Haoang Chi, Yibing Zhan, Bowen Hu, Xiaoguang Ren, Dapeng Tao, Long Lan

College of Computer Science and Technology, National University of Defense Technology, Changsha, 410073, Hunan, PR China

Academy of Military Sciences, Beijing, 100071, PR China

JD Explore Academy, Beijing, 100176, PR China

Yunnan University, Kunming, 650091, Yunnan, PR China

2025

Artificial intelligence

SCI
ISSN: 0004-3702
Year, Volume (Issue): 2025, Vol. 346 (Sep.)