Deep learning object detection models are applied in a wide range of scenarios. However, the detection accuracy of deployed models is often low due to the performance limitations of deployment devices. To enhance the performance of detection models, this paper proposes an efficient dynamic distillation training method. The method introduces a dynamic sample assignment strategy to select high-quality outputs from the teacher model, and pairs it with dynamic weight adjustment of the distillation loss, thereby improving the traditional distillation algorithm used in object detection models. Experimental results on a dataset for electrical grid construction safety show that, compared with direct training, the method raises the Average Precision (AP) of the YOLOv6-n model by an average of 2.63 percentage points. The proposed distillation method does not affect the inference speed of the deployed model and helps improve the detection performance of object detection models in various industrial scenarios.
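The two ideas named in the abstract, selecting only high-quality teacher outputs and dynamically weighting the distillation loss, can be illustrated with a minimal sketch. This is not the paper's actual implementation: the confidence-threshold selection criterion, the use of teacher confidence as the dynamic weight, and all function names here are illustrative assumptions, written in plain Python for clarity.

```python
import math

def softmax(logits):
    """Convert raw logits to a probability distribution."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_div(p, q):
    """KL divergence D(p || q), the usual soft-label distillation loss."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def dynamic_distill_loss(student_logits, teacher_logits, hard_loss,
                         conf_thresh=0.5):
    """Hypothetical dynamic distillation loss for one sample.

    Dynamic sample assignment: distill only when the teacher's top
    confidence exceeds a threshold (an assumed selection criterion).
    Dynamic weighting: blend the hard-label loss and the distillation
    loss using the teacher's confidence as the weight (also assumed).
    """
    p_teacher = softmax(teacher_logits)
    conf = max(p_teacher)
    if conf < conf_thresh:
        # Low-quality teacher output: fall back to the hard-label loss.
        return hard_loss
    p_student = softmax(student_logits)
    kd_loss = kl_div(p_teacher, p_student)
    # Trust the teacher more when it is confident.
    w = conf
    return (1 - w) * hard_loss + w * kd_loss
```

In a full detector this per-sample logic would run over the matched predictions of the student and teacher heads; here it only shows how sample selection and loss weighting interact, which is why the method adds no cost at inference time.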
Key words
deep learning/object detection/knowledge distillation