计算机应用与软件 (Computer Applications and Software), 2024, Vol. 41, Issue (12): 167-172, 207. DOI: 10.3969/j.issn.1000-386x.2024.12.024

CLOTHING IMAGE CLASSIFICATION BASED ON KNOWLEDGE DISTILLATION OF MULTI-STRUCTURE TEACHERS

张晓滨 1, 刘昊 1

Author information

  • 1. School of Computer Science, Xi'an Polytechnic University (西安工程大学), Xi'an 710048, Shaanxi, China


Abstract

To address the complex structure and large parameter counts of current clothing image classification models, this paper proposes a clothing image classification method based on multi-teacher knowledge distillation. The key idea is to select multiple distilled models that have learned different types of knowledge as a multi-teacher network, adaptively assign a weight to each teacher according to its performance, and let the teachers jointly supervise the training of the target model, thereby producing a lightweight clothing classification model. Experiments on the DeepFashion dataset show that the proposed method improves accuracy by about 1.14 percentage points over a clothing classification model with the same network structure, while the model itself has only 0.27M parameters.
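The abstract only outlines the adaptive multi-teacher scheme. The sketch below is a rough PyTorch illustration of one plausible realization, not the authors' published code: the weighting rule (softmax over negative per-batch teacher losses), the temperature T, the mixing factor alpha, and the names student/teachers are all assumptions, since the paper's exact formulas are not given in this record.

```python
# Minimal multi-teacher knowledge-distillation sketch (PyTorch).
# Assumption: each teacher's weight comes from its per-batch cross-entropy
# against the ground-truth labels (lower loss -> larger weight); the paper's
# actual adaptive weighting rule may differ.
import torch
import torch.nn.functional as F


def adaptive_teacher_weights(teacher_logits, labels):
    """Weight better-performing teachers more heavily on this batch."""
    losses = torch.stack([F.cross_entropy(t, labels) for t in teacher_logits])
    return F.softmax(-losses, dim=0)  # lower loss -> larger weight, sums to 1


def multi_teacher_kd_loss(student_logits, teacher_logits, labels,
                          T=4.0, alpha=0.7):
    """Weighted soft-target loss from several teachers plus the hard-label loss."""
    w = adaptive_teacher_weights(teacher_logits, labels)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    kd = sum(w[i] * F.kl_div(log_p_student,
                             F.softmax(t / T, dim=1),
                             reduction="batchmean") * (T * T)
             for i, t in enumerate(teacher_logits))
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce


# Hypothetical usage inside a training step:
# student_logits = student(images)                         # lightweight target model
# teacher_logits = [t(images).detach() for t in teachers]  # frozen multi-structure teachers
# loss = multi_teacher_kd_loss(student_logits, teacher_logits, labels)
```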


Key words

Model compression; Knowledge distillation; Multi-teacher knowledge distillation; Clothing image classification


Publication year

2024

Journal: 计算机应用与软件 (Computer Applications and Software)
Sponsors: Shanghai Institute of Computing Technology; Shanghai Computer Software Technology Development Center
Indexing: CSTPCD; Peking University Core Journals (北大核心)
Impact factor: 0.615
ISSN: 1000-386X