Image classification framework based on knowledge distillation
To address the difficulty of effectively integrating the features of CNN and Transformer networks in image classification tasks, this paper proposes a knowledge-distillation-based image classification framework: Knowledge Distillation Image Classification (KDIC). In the KDIC framework, several knowledge distillation methods are designed according to the structural differences between CNNs and Transformers: the framework effectively fuses the local features of CNNs and the global representations of Transformers into a lightweight student model, and it introduces loss functions tailored to the different distillation methods to improve image classification performance. Image classification experiments were carried out on three public datasets: CIFAR-10, CIFAR-100, and UC-Merced. The experimental results show that the KDIC framework has clear advantages over current knowledge distillation methods, and that KDIC maintains strong performance and good generalization across different teacher and student networks.
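As a rough illustration of the kind of objective the abstract describes, the sketch below combines a hard-label cross-entropy term with temperature-softened KL-divergence terms against two teachers (a CNN and a Transformer). This is a minimal NumPy sketch, not the paper's actual KDIC loss: the function name `kdic_style_loss`, the weights `alpha`/`beta`, and the temperature `T` are illustrative assumptions, following the standard logit-distillation formulation.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kl_div(p, q, eps=1e-12):
    """KL(p || q), averaged over the batch."""
    return float(np.mean(np.sum(p * (np.log(p + eps) - np.log(q + eps)), axis=-1)))

def kdic_style_loss(student_logits, cnn_logits, vit_logits, labels,
                    T=4.0, alpha=0.5, beta=0.25):
    """Hypothetical combined distillation loss (illustrative, not the paper's):
    cross-entropy on hard labels plus softened KL terms against a CNN teacher
    and a Transformer teacher. The T**2 factor keeps gradient magnitudes of
    the softened terms comparable across temperatures."""
    n = student_logits.shape[0]
    s_soft = softmax(student_logits, T)
    # hard-label cross-entropy on the student's unsoftened predictions
    ce = -float(np.mean(np.log(softmax(student_logits)[np.arange(n), labels] + 1e-12)))
    # distillation terms: teacher distribution first, student second
    l_cnn = kl_div(softmax(cnn_logits, T), s_soft)
    l_vit = kl_div(softmax(vit_logits, T), s_soft)
    return (1 - alpha - beta) * ce + (T ** 2) * (alpha * l_cnn + beta * l_vit)
```

When the student's logits agree with both teachers and place high confidence on the correct class, all three terms shrink toward zero, so the loss rewards matching both teachers' output distributions while still fitting the ground-truth labels.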