Feature distribution guided binary neural networks
In recent years, binary neural networks (BNNs) have attracted attention for their small memory footprint and high computational efficiency. However, a significant performance gap remains between BNNs and floating-point deep neural networks (DNNs), caused in part by imbalanced distributions of the positive and negative parts of quantized activation features, which hinders deployment on resource-constrained platforms. The limited accuracy of binary networks stems mainly from the information loss caused by feature discretization and the loss of semantic information caused by improper distribution optimization. To address this problem, this paper applies feature distribution adjustment to guide binarization: the mean and variance of features are adjusted to balance the feature distribution and reduce the information loss caused by discretization. In addition, a group excitation and feature fine-tuning module optimizes the quantization zero points, balancing the binarized activation distributions and preserving semantic information to the greatest extent possible. Experiments show that the proposed method achieves better results with different backbone networks on different datasets; for example, binarizing ResNet-18 on CIFAR-10 loses only 0.4% accuracy, surpassing current mainstream BNNs.
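The mean-variance adjustment and zero-point idea described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the per-tensor standardization, and the `zero_point` parameter are assumptions made for the example.

```python
import numpy as np

def binarize_with_distribution_adjustment(x, zero_point=0.0, eps=1e-5):
    """Illustrative sketch: standardize activations to zero mean and unit
    variance (mean-variance adjustment), then binarize around a quantization
    zero point instead of a fixed threshold of 0.

    All names and the exact standardization scheme are hypothetical; the
    paper's actual module (group excitation, feature fine-tuning) is more
    elaborate.
    """
    # Mean-variance adjustment: balances the positive and negative parts
    # of the activation distribution before discretization.
    x_adj = (x - x.mean()) / np.sqrt(x.var() + eps)
    # Binarize around the (possibly learned) zero point.
    return np.where(x_adj >= zero_point, 1.0, -1.0)
```

On a skewed input such as `[0.0, 1.0, 2.0, 3.0]`, plain sign binarization would map every element to +1, whereas the standardized version yields two +1s and two -1s, keeping the binary code balanced.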
feature distribution; mean and variance adjustment; semantic information preservation; model compression; binary neural networks; neural network quantization