Few-Shot Incremental Learning Based on Attention Mechanism and Knowledge Distillation
Current few-shot learning methods focus mainly on performance on the few-shot categories while ignoring performance on the auxiliary set. To address this problem, a few-shot incremental learning model based on an attention mechanism and knowledge distillation is proposed. The attention mechanism is adopted to improve generalization on the few-shot data, while knowledge distillation is used to retain discriminative ability on the auxiliary set. As a result, the proposed model achieves acceptable classification performance on both the few-shot data and the auxiliary set. Experiments show that the proposed model not only achieves excellent performance on the few-shot data, but also suffers little performance loss on the auxiliary set.
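As a rough illustration of the distillation component described above (a generic sketch, not the paper's actual implementation), the knowledge-distillation term commonly used to preserve performance on old classes compares the temperature-softened outputs of a frozen teacher (the model before incremental training) and the student:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; a higher T flattens the distribution,
    # exposing more of the teacher's "dark knowledge" about non-target classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between teacher and student soft targets, averaged over
    # the batch and scaled by T^2 (the standard Hinton-style formulation).
    # The temperature T=2.0 here is an illustrative choice, not from the paper.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)

# Sanity check: if the student matches the teacher exactly,
# the distillation loss is zero.
logits = np.array([[2.0, 0.5, -1.0]])
assert distillation_loss(logits, logits) == 0.0
```

In an incremental setting, this term would typically be added to the cross-entropy loss on the new few-shot classes, so the student fits the novel data while its outputs on the auxiliary (base) classes stay close to the teacher's.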