Compression method combining feature channel importance and similarity for deep YOLO network
Object detection methods based on deep YOLO networks suffer from complex network structures, redundant parameters, and high computational cost, which greatly limit the model's detection performance. To address these issues, we construct a deep YOLO network compression method that integrates feature channel importance and similarity to reduce the impact of inefficient and redundant channels in the YOLO network. Following the network pruning idea in deep network compression, the method uses a two-stage pruning approach to remove inefficient and redundant feature channels. First, a channel importance measure is constructed in which a sparsity factor serves as the indicator of channel inefficiency, and channels are pruned according to their importance ranking and a preset pruning rate. Second, the similarity between channels is computed from the linear relationship between them, and channels with high similarity are pruned because they can be replaced by one another. After pruning, the model parameters are fine-tuned to restore the detection accuracy to its pre-pruning level. In simulation experiments on real object detection datasets, compared with current high-performing deep network compression schemes, the proposed method greatly reduces model size and improves detection speed while maintaining detection accuracy, demonstrating the feasibility and effectiveness of the method.
deep learning; object detection; YOLO network; feature channel
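The two-stage pruning described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the sparsity-based importance score is the magnitude of a per-channel scale factor (e.g. a batch-normalization scale), and measures channel similarity as the Pearson correlation of flattened feature maps, one plausible reading of "linear relationship between channels". The function names, threshold value, and data shapes are all hypothetical.

```python
import numpy as np

def prune_by_importance(scales, prune_rate):
    """Stage 1: rank channels by an assumed sparsity-style importance score
    (magnitude of per-channel scale factors) and prune the lowest fraction."""
    order = np.argsort(np.abs(scales))           # ascending: least important first
    n_prune = int(len(scales) * prune_rate)
    return np.sort(order[n_prune:])              # indices of surviving channels

def prune_by_similarity(features, keep, sim_threshold=0.95):
    """Stage 2: among surviving channels, compute pairwise linear similarity
    (correlation of flattened feature maps) and drop one channel of any pair
    above the threshold, since it can be replaced by the other."""
    flat = features[keep].reshape(len(keep), -1)
    corr = np.corrcoef(flat)                     # (k, k) correlation matrix
    redundant = set()
    for i in range(len(keep)):
        for j in range(i + 1, len(keep)):
            if j not in redundant and abs(corr[i, j]) > sim_threshold:
                redundant.add(j)
    return np.array([c for idx, c in enumerate(keep) if idx not in redundant])

# Toy usage: 6 channels, 50% importance pruning, then similarity pruning.
scales = np.array([0.01, 1.0, 0.5, 0.02, 0.8, 0.03])
features = np.zeros((6, 2, 2))
features[1] = [[1, 2], [3, 4]]
features[2] = 2 * features[1]                    # channel 2 is a scaled copy of 1
features[4] = [[4, 1], [2, 2]]

keep = prune_by_importance(scales, prune_rate=0.5)    # channels 1, 2, 4 survive
final = prune_by_similarity(features, keep)           # channel 2 removed as redundant
```

In a full pipeline, the surviving indices would be used to slice the convolution weights of the pruned layer (and the input channels of the following layer), after which the network is fine-tuned as the abstract describes.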