现代信息科技 (Modern Information Technology), 2024, Vol. 8, Issue 16: 49-52, 59. DOI: 10.19850/j.cnki.2096-4706.2024.16.011

基于多分类器分级蒸馏的长尾视觉识别方法

Long-tailed Visual Recognition Method Based on Multi-classifier Graded Distillation

GONG Xuanjin (巩炫瑾)

Author Information

  • 1. School of Computer Science and Mathematics, Fujian University of Technology, Fuzhou 350118, Fujian, China

Abstract

In order to enhance model performance in the long-tailed visual recognition domain, this paper proposes a multi-classifier graded distillation framework. The framework comprises rotation self-supervised pre-training and multi-classifier distillation. Rotation self-supervised pre-training treats every image equally by predicting image rotations, mitigating the influence of long-tailed labels on the model. Multi-classifier distillation transfers the teacher model's knowledge to the student model in one-to-one correspondence through three specifically optimized classifiers. Extensive experiments are conducted on open-source long-tailed image recognition datasets, with comparisons against existing methods. The results demonstrate that the proposed method achieves notable improvements in long-tailed visual recognition.
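The paper's own training code is not reproduced here. As background for the rotation pretext task the abstract describes, the following is a minimal numpy sketch of the standard RotNet-style setup (Gidaris et al.): each image is rotated by a random multiple of 90 degrees, and the rotation index serves as a label that ignores the long-tailed class labels entirely. The function name `make_rotation_batch` and the fake image tensor are illustrative assumptions, not identifiers from the paper.

```python
import numpy as np

def make_rotation_batch(images, rng):
    """Build a self-supervised batch: each image gets a random rotation
    in {0, 90, 180, 270} degrees; the rotation index is the label.
    Every image contributes equally, regardless of its class frequency."""
    labels = rng.integers(0, 4, size=len(images))
    rotated = np.stack([np.rot90(img, k) for img, k in zip(images, labels)])
    return rotated, labels

rng = np.random.default_rng(0)
imgs = rng.random((8, 32, 32, 3))      # 8 fake HxWxC images
x, y = make_rotation_batch(imgs, rng)
print(x.shape)                          # (8, 32, 32, 3)
```

A backbone pre-trained on this 4-way rotation task sees a balanced label distribution by construction, which is why such pretext objectives are attractive under long-tailed data.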

Key words

knowledge distillation / long-tailed distribution / image recognition / deep learning model
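The paper's specific three-classifier graded distillation is not detailed on this page. As background, a minimal numpy sketch of the temperature-scaled soft-label distillation loss (Hinton et al.) that such teacher-student frameworks typically build on; applying one such loss per classifier pair would give the one-to-one transfer the abstract describes. The temperature `T=2.0` is an illustrative choice, not a value from the paper.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(teacher_logits, student_logits, T=2.0):
    """Soft-label distillation loss: KL(teacher_T || student_T),
    with both distributions softened by temperature T, batch-averaged."""
    p = softmax(teacher_logits / T)
    q = softmax(student_logits / T)
    return float(np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1)))

t = np.array([[2.0, 0.5, -1.0]])
print(distill_loss(t, t))   # 0.0 — identical logits give zero loss
```

The loss is zero exactly when the student reproduces the teacher's softened distribution, and a higher temperature exposes more of the teacher's inter-class similarity structure.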


Publication

Year: 2024
Journal: 现代信息科技 (Modern Information Technology)
Publisher: 广东省电子学会 (Guangdong Electronics Society)
ISSN: 2096-4706