Neural Networks, 2022, Vol. 152. DOI: 10.1016/j.neunet.2022.05.015

Semantic consistency learning on manifold for source data-free unsupervised domain adaptation

Zou, Yan (1); Song, Zihao (1); Lyu, Jianzhi (2); Chen, Lijuan (2); Ye, Mao (3); Zhong, Shouming (3); Zhang, Jianwei (2); Tang, Song (1)
Author Information

  • 1. Institute of Machine Intelligence (IMI), University of Shanghai for Science and Technology
  • 2. Technical Aspects of Multimodal Systems (TAMS) Group, University of Hamburg
  • 3. University of Electronic Science and Technology of China

Abstract

Source data-free unsupervised domain adaptation (SFUDA) has recently attracted increasing attention. Existing work shows that the geometry of the target data helps solve this challenging problem; however, these methods define geometric structures in Euclidean space, which cannot fully capture the semantic relationships among target data distributed on a manifold. This article proposes a new SFUDA method, semantic consistency learning on manifold (SCLM), to address this problem. First, we generate pseudo-labels for the target data using a new clustering method, EntMomClustering, which enhances k-means clustering by fusing in an entropy momentum. Second, we construct a semantic neighbor topology (SNT) to capture complete geometric information on the manifold: the global neighbor is detected by a newly developed collaborative representation-based manifold projection, while the local neighbors are obtained by similarity comparison. Third, we perform semantic consistency learning on the SNT, driving a new kind of deep clustering in which the SNT serves as the basic clustering unit. To ensure that the SNT moves as a whole, the objective includes an entropy regularizer constructed from a semantic mixture fused over the SNT, while a self-supervised regularizer encourages consistent classification across the SNT. Experiments on three benchmark datasets show that our method achieves state-of-the-art results. The code is available at https://github.com/tntek/SCLM.
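The SNT construction summarized above, a global neighbor found by collaborative representation against class prototypes plus local neighbors found by similarity comparison, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the prototype dictionary, the ridge-regularized collaborative-representation solver, cosine similarity for the local neighbors, and the names `build_snt`, `k_local`, and `lam` are all choices made for this sketch.

```python
import numpy as np

def collaborative_representation(x, D, lam=0.01):
    """Solve min_a ||x - D a||^2 + lam ||a||^2 (ridge-regularized
    collaborative representation) in closed form."""
    A = D.T @ D + lam * np.eye(D.shape[1])
    return np.linalg.solve(A, D.T @ x)

def build_snt(x, feats, labels, n_classes, k_local=3, lam=0.01):
    """Sketch of a semantic neighbor topology for one target feature x.

    Global neighbor: the pseudo-class whose prototype (dictionary atom)
    dominates the collaborative representation of x.
    Local neighbors: the k_local most cosine-similar target samples.
    """
    # One prototype column per pseudo-class, averaged over pseudo-labels.
    protos = np.stack(
        [feats[labels == c].mean(axis=0) for c in range(n_classes)], axis=1
    )
    a = collaborative_representation(x, protos, lam)
    global_neighbor = int(np.argmax(np.abs(a)))  # dominant atom's class
    # Cosine similarity of x to every target feature.
    sims = feats @ x / (np.linalg.norm(feats, axis=1) * np.linalg.norm(x) + 1e-8)
    local_neighbors = np.argsort(-sims)[:k_local]
    return global_neighbor, local_neighbors
```

On toy data with two well-separated pseudo-clusters, a query drawn near the first cluster gets that cluster as its global neighbor and its local neighbors from the same cluster, so the resulting SNT groups semantically consistent samples, which is the property the consistency-learning objective then exploits.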

Keywords

Unsupervised domain adaptation; Semantic consistency; Manifold; Self-supervised learning; Sparse representation


Publication Year

2022

Journal

Neural Networks

Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 7
References: 52