De-biased knowledge distillation framework based on knowledge infusion and label de-biasing techniques

Knowledge distillation, as a pivotal technique for model compression, has been widely applied across various domains. However, inherent biases in the teacher model still limit the performance of the student model during distillation. To address this problem, we propose a de-biased knowledge distillation framework tailored for binary classification tasks. For the pre-trained teacher model, biases in its soft labels are mitigated through knowledge infusion and label de-biasing. On this basis, a de-biased distillation loss is introduced, so that the de-biased labels replace the soft labels as the fitting target of the student model. This allows the student model to learn from corrected teacher information, enabling high-performance deployment of lightweight student models. Experiments on multiple real-world datasets show that deep learning models compressed under the de-biased knowledge distillation framework significantly outperform traditional response-based and feature-based knowledge distillation models across various evaluation metrics, demonstrating the effectiveness and superiority of the framework for model compression.
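The abstract does not specify the exact knowledge infusion and label de-biasing rules, but its core mechanism is that corrected ("de-biased") labels replace the teacher's raw soft labels as the student's fitting target. The following is a minimal PyTorch sketch of that idea under an illustrative assumption: wherever the teacher's prediction disagrees with the ground-truth label, the soft label is replaced by the one-hot ground-truth distribution. The function name `debiased_distillation_loss` and the hyperparameters `alpha` and `temperature` are hypothetical, not the paper's notation.

```python
import torch
import torch.nn.functional as F


def debiased_distillation_loss(student_logits, teacher_logits, labels,
                               alpha=0.5, temperature=4.0):
    """Sketch of a de-biased distillation loss for binary classification.

    The correction rule below (fall back to the one-hot ground truth
    whenever the teacher misclassifies a sample) is an assumption made
    for illustration; the paper's actual knowledge infusion and label
    de-biasing procedures are not described in the abstract.
    """
    # Teacher soft labels at temperature T.
    soft = F.softmax(teacher_logits / temperature, dim=1)

    # Assumed de-biasing rule: where the teacher's prediction disagrees
    # with the label, use the one-hot ground-truth distribution instead.
    one_hot = F.one_hot(labels, num_classes=soft.size(1)).float()
    wrong = (soft.argmax(dim=1) != labels).unsqueeze(1)
    debiased = torch.where(wrong, one_hot, soft)

    # The student fits the de-biased labels rather than the raw soft labels.
    log_p = F.log_softmax(student_logits / temperature, dim=1)
    kd = F.kl_div(log_p, debiased, reduction="batchmean") * temperature ** 2

    # Standard supervised cross-entropy on the hard labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce


if __name__ == "__main__":
    # Toy usage on random binary-classification logits.
    student_logits = torch.randn(8, 2)
    teacher_logits = torch.randn(8, 2)
    labels = torch.randint(0, 2, (8,))
    print(debiased_distillation_loss(student_logits, teacher_logits, labels))
```

The KL term is scaled by the squared temperature, following standard distillation practice, so that gradient magnitudes stay comparable as the temperature changes.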

De-biasing; Deep learning; Knowledge distillation; Model compression

Yan Li, Tai-Kang Tian, Meng-Yu Zhuang, Yu-Ting Sun


School of Economics and Management, University of Electronic Science and Technology of China, Chengdu 611731, China

School of Economics and Management, Beijing University of Posts and Telecommunications, Beijing 100876, China

School of Electrical Engineering and Computer Science, The University of Queensland, Brisbane 4072, Australia

2024

Journal of Electronic Science and Technology
University of Electronic Science and Technology of China


Impact factor: 0.154
ISSN: 1674-862X
Year, volume (issue): 2024, 22(3)