Channel Attention Module Based on Instance Normalization
Traditional channel attention modules not only add a large number of parameters but also greatly increase model complexity, because they rely on pooling and downsampling layers to obtain the channel weights of feature maps. In this paper, an Instance Normalization Channel Attention Module (INCAM) is proposed, which captures the channel weights of a feature map by measuring the variance of the scaling variable in instance normalization, achieving a significant performance gain while adding only a small number of parameters. Extensive experiments on the CIFAR-100 and CIFAR-10 datasets show that ResNet-50 embedded with INCAM reduces the Top-1 error by 11.20% compared with the original ResNet-50, while the number of parameters increases by only 0.12%; INCAM is also lighter and more efficient than other attention modules.
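The abstract only states that the channel weights are obtained from the scaling variable of instance normalization rather than from extra pooling or downsampling layers. The sketch below is a minimal illustration of that idea in PyTorch; the class name INCAMSketch, the gamma-normalization step, and the sigmoid gating are assumptions for illustration, not the authors' formulation.

```python
# Minimal sketch of a normalization-based channel attention block (NOT the
# authors' INCAM code). ASSUMPTION: the channel weight is derived from the
# learnable per-channel scaling parameter (gamma) of an InstanceNorm2d layer,
# normalized across channels and applied through a sigmoid gate.
import torch
import torch.nn as nn


class INCAMSketch(nn.Module):
    """Hypothetical instance-normalization channel attention block."""

    def __init__(self, channels: int):
        super().__init__()
        # affine=True adds a learnable per-channel scaling parameter gamma
        self.inorm = nn.InstanceNorm2d(channels, affine=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.inorm(x)
        gamma = self.inorm.weight                     # shape: (C,)
        # Assumed weighting: each channel's share of the total scaling magnitude
        w = gamma.abs() / gamma.abs().sum()
        gate = torch.sigmoid(out * w.view(1, -1, 1, 1))  # broadcast over N, H, W
        return x * gate                               # re-weight the input channels


if __name__ == "__main__":
    block = INCAMSketch(channels=64)
    feat = torch.randn(2, 64, 32, 32)
    print(block(feat).shape)                          # torch.Size([2, 64, 32, 32])
```

Because the weights are read directly from parameters the normalization layer already has, a block of this kind adds only the affine parameters of the InstanceNorm2d layer per insertion point, which is consistent with the very small parameter overhead reported in the abstract.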