In anomaly detection methods based on knowledge distillation, the teacher network is typically much larger than the student network, so the feature representations the two networks extract at the same image position correspond to different receptive fields. This mismatch can be avoided by giving the student network the same structure as the teacher network. However, in the testing phase, identical student and teacher networks produce feature representations that differ too little, which degrades anomaly detection performance. To address this problem, an ECA-based multi-scale knowledge distillation anomaly detection method (ECA-MSKDAD) is proposed, together with a relative distance loss function built on a data augmentation operation. A pre-trained network is used as the teacher network, and a network with the same architecture as the teacher is used as the student network. In the training stage, data augmentation is applied to the training samples to enlarge the training set, and an efficient channel attention (ECA) module is introduced into the student network to increase the difference between the teacher and the student, enlarge the reconstruction error on anomalous data, and improve the detection performance of the model. In addition, the relative distance loss transfers the relationships among data from the teacher network to the student network and is used to optimize the student network's parameters. Experiments on MVTec AD show that, compared with nine related methods, the proposed method achieves better performance in both anomaly detection and anomaly localization.
Keywords: deep learning; anomaly detection; anomaly localization; knowledge distillation; attention mechanism
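The ECA module named in the abstract is a published attention block (Wang et al., CVPR 2020): a global average pool produces a per-channel descriptor, a 1D convolution with an adaptively sized kernel models local cross-channel interaction, and a sigmoid gate re-weights the channels. The abstract does not specify where the module sits inside the student network, so the PyTorch sketch below is illustrative only.

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention: GAP -> 1D conv over channels -> sigmoid gate.
    Illustrative sketch; placement inside the student network is an assumption."""

    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Kernel size adapts to the channel count, as in Wang et al. (CVPR 2020).
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) -> per-channel descriptor (B, C, 1, 1)
        y = self.avg_pool(x)
        # Treat channels as a 1D sequence: (B, 1, C) -> local cross-channel conv
        y = self.conv(y.squeeze(-1).transpose(-1, -2))
        # Back to (B, C, 1, 1) and gate the input channel-wise
        y = self.sigmoid(y.transpose(-1, -2).unsqueeze(-1))
        return x * y
```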
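The abstract does not give the exact form of the relative distance loss; one plausible reading, in the spirit of relational knowledge distillation, is that the pairwise distances among samples in the student's feature space should match those in the teacher's. The function name, the scale normalization, and the smooth-L1 penalty below are all assumptions made for illustration.

```python
import torch
import torch.nn.functional as F

def relative_distance_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """Hypothetical relative-distance loss: align normalized pairwise
    feature distances of the student (f_s) with those of the teacher (f_t).
    f_s, f_t: feature maps of shape (B, C, H, W) from matching layers."""
    # Flatten each sample's feature map to a vector: (B, C*H*W)
    s = f_s.flatten(1)
    t = f_t.flatten(1)
    # Pairwise Euclidean distances within the batch: (B, B)
    d_s = torch.cdist(s, s, p=2)
    d_t = torch.cdist(t, t, p=2)
    # Normalize by the mean pairwise distance so that scale differences
    # between the two embedding spaces do not dominate the loss.
    d_s = d_s / (d_s.mean() + 1e-8)
    d_t = d_t / (d_t.mean() + 1e-8)
    # Penalize mismatched relative distances (relation transfer).
    return F.smooth_l1_loss(d_s, d_t)
```

Under this reading, the loss transfers relations among (augmented) training samples rather than individual feature values, which is consistent with the abstract's claim that relationships between data are distilled from teacher to student.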