
Image Classification Framework Based on Knowledge Distillation

To address the difficulty of effectively fusing CNN and Transformer features in image classification tasks, this paper proposes a knowledge-distillation-based image classification framework (Knowledge Distillation Image Classification, KDIC). Within KDIC, several knowledge distillation methods are designed around the structural differences between CNNs and Transformers: the framework effectively fuses the local features of CNNs and the global representations of Transformers into a lightweight student model, and introduces effective loss functions for the different distillation methods to improve classification performance. Image classification experiments were conducted on three public datasets: CIFAR10, CIFAR100, and UC-Merced. The results show that KDIC has clear advantages over current knowledge distillation methods, and that it retains good performance and generalization across different teacher-student networks.
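The abstract describes distilling both a CNN teacher and a Transformer teacher into a lightweight student via distillation loss functions. As a minimal illustration only (not the KDIC-specific losses, which the abstract does not detail), the classic soft-target distillation loss can be sketched in pure Python; the two-teacher weight `beta`, temperature `T`, and hard/soft balance `alpha` are illustrative assumptions:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T produces softer distributions.
    m = max(l / T for l in logits)
    exps = [math.exp(l / T - m) for l in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    # KL(p || q), with the teacher distribution as p.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def kd_loss(student_logits, cnn_logits, vit_logits, label,
            T=4.0, alpha=0.7, beta=0.5):
    """Hard-label cross-entropy plus temperature-scaled KL terms against
    a CNN teacher and a Transformer teacher.  `beta` weights the two
    teachers; this weighting is an illustrative assumption, not the
    paper's scheme.  The T*T factor keeps soft-target gradients on the
    same scale as the hard-label term (standard Hinton-style KD)."""
    ce = -math.log(softmax(student_logits)[label])
    s_soft = softmax(student_logits, T)
    soft = (beta * kl_div(softmax(cnn_logits, T), s_soft)
            + (1 - beta) * kl_div(softmax(vit_logits, T), s_soft))
    return alpha * ce + (1 - alpha) * (T * T) * soft
```

When the student already matches both teachers, the soft term vanishes and only the hard-label cross-entropy remains, which is the usual sanity check for a KD loss.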

computer application; knowledge distillation; image classification; convolutional neural network

赵宏伟、武鸿、马克、李海


College of Computer Science and Technology, Jilin University, Changchun 130012, China

School of Mechanical and Aerospace Engineering, Jilin University, Changchun 130025, China


2024

Journal of Jilin University (Engineering and Technology Edition)
Jilin University


Indexed in: CSTPCD; Peking University Core Journals
Impact factor: 0.792
ISSN:1671-5497
Year, Volume (Issue): 2024, 54(8)