A Regularized Class Incremental Learning Method Based on Self-Supervision with Distillation Constraints
Aiming at the problem of catastrophic forgetting in incremental learning of neural network models, a regularized class-incremental learning method based on self-supervision with hidden-layer distillation constraints is proposed, comprising pseudo-label prediction, knowledge distillation, and parameter regularization. First, a regularization constraint method based on Bayesian inference and information theory is proposed to evaluate the importance of model parameters. The representational ability of the model is then enhanced through self-supervised pseudo-label prediction, and Gaussian noise is added to the hidden-layer features to preserve them and improve their generalization. The hidden-layer and output-layer features of historical tasks are retained by training with a distillation constraint jointly with a cross-entropy classification loss. Experimental results show that the method performs well on the CIFAR-10 and CIFAR-100 datasets; on CIFAR-100, the average accuracy and forgetting rate reach 64.16% and 15.95%, respectively. The proposed method is effective in reducing the effects of catastrophic forgetting.
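The abstract describes a training objective that combines a cross-entropy classification loss, a self-supervised pseudo-label loss, hidden- and output-layer distillation against the old model (with Gaussian noise on the hidden features), and an importance-weighted parameter regularizer. The following is a minimal sketch of such a combined loss in PyTorch, assuming a rotation-style pseudo-label head and illustrative weight names (`lambda_hid`, `lambda_out`, `lambda_reg`, `noise_std`, `T`); it is not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def incremental_loss(logits, labels,
                     rot_logits, rot_labels,
                     hid_new, hid_old,
                     out_new, out_old,
                     params, old_params, importance,
                     noise_std=0.1, T=2.0,
                     lambda_hid=1.0, lambda_out=1.0, lambda_reg=1.0):
    # Cross-entropy classification loss on the current task
    loss_ce = F.cross_entropy(logits, labels)

    # Self-supervised pseudo-label prediction loss (e.g., rotation prediction;
    # the specific pretext task is an assumption here)
    loss_ssl = F.cross_entropy(rot_logits, rot_labels)

    # Hidden-layer distillation against the frozen old model's features,
    # with Gaussian noise added to improve feature generalization
    hid_old_noisy = hid_old.detach() + noise_std * torch.randn_like(hid_old)
    loss_hid = F.mse_loss(hid_new, hid_old_noisy)

    # Output-layer distillation on the old-task logits (temperature-scaled soft targets)
    loss_out = F.kl_div(F.log_softmax(out_new / T, dim=1),
                        F.softmax(out_old.detach() / T, dim=1),
                        reduction="batchmean") * (T * T)

    # Parameter regularization weighted by per-parameter importance estimates
    loss_reg = sum((w * (p - p_old).pow(2)).sum()
                   for p, p_old, w in zip(params, old_params, importance))

    return (loss_ce + loss_ssl
            + lambda_hid * loss_hid
            + lambda_out * loss_out
            + lambda_reg * loss_reg)
```

In this sketch, `hid_old`/`out_old` come from the frozen model of the previous task, and `importance` holds the per-parameter importance weights produced by the Bayesian/information-theoretic evaluation described in the abstract.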