Neural Networks, 2022, Vol. 145. DOI: 10.1016/j.neunet.2021.10.008

Interpolation consistency training for semi-supervised learning

Kawaguchi K. [1], Lamb A. [2], Kannala J. [3], Solin A. [3], Bengio Y. [2], Lopez-Paz D. [1], Verma V. [2]

Author information

  • 1. Harvard University
  • 2. Montreal Institute for Learning Algorithms (MILA)
  • 3. Aalto University

Abstract

© 2021 The Authors. We introduce Interpolation Consistency Training (ICT), a simple and computationally efficient algorithm for training Deep Neural Networks in the semi-supervised learning paradigm. ICT encourages the prediction at an interpolation of unlabeled points to be consistent with the interpolation of the predictions at those points. In classification problems, ICT moves the decision boundary to low-density regions of the data distribution. Our experiments show that ICT achieves state-of-the-art performance when applied to standard neural network architectures on the CIFAR-10 and SVHN benchmark datasets. Our theoretical analysis shows that ICT corresponds to a certain type of data-adaptive regularization with unlabeled points, which reduces overfitting to labeled points under high confidence values.
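The consistency term described in the abstract can be stated compactly in code. The following is a minimal sketch, assuming PyTorch; the names student, teacher, u1, u2, and alpha are illustrative (in the paper's setup the teacher targets typically come from a moving-average copy of the student, and the mixing coefficient follows the standard Mixup recipe).

import torch
import torch.nn.functional as F

def ict_consistency_loss(student, teacher, u1, u2, alpha=1.0):
    # Sample the mixing coefficient lambda ~ Beta(alpha, alpha), as in Mixup.
    lam = torch.distributions.Beta(alpha, alpha).sample().to(u1.device)
    # Interpolate two batches of unlabeled inputs.
    u_mix = lam * u1 + (1.0 - lam) * u2
    # Teacher predictions at the original unlabeled points (no gradients).
    with torch.no_grad():
        p1 = torch.softmax(teacher(u1), dim=1)
        p2 = torch.softmax(teacher(u2), dim=1)
    # Target: the interpolation of the predictions at those points.
    target = lam * p1 + (1.0 - lam) * p2
    # The student's prediction at the interpolated point should match the target.
    pred = torch.softmax(student(u_mix), dim=1)
    return F.mse_loss(pred, target)

This term is added to the usual supervised loss on the labeled batch, typically weighted by a consistency coefficient that is ramped up over training.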

Key words

Consistency regularization; Deep Neural Networks; Mixup; Semi-supervised learning


Publication year: 2022
Journal: Neural Networks
Indexed in: EI, SCI
ISSN: 0893-6080
Cited by: 71
References: 36