
Channel Attention Module Based on Instance Normalization

Traditional channel attention modules not only add a large number of parameters but also greatly increase the complexity of the model when constructing pooling and downsampling layers to obtain the channel weights of feature maps. To address this problem, an Instance Normalization Channel Attention Module (INCAM) is proposed, which captures the channel weights of feature maps by measuring variance through the scaling variable of instance normalization, so that a significant performance gain can be achieved by adding only a small number of parameters. Extensive experiments on the CIFAR-100 and CIFAR-10 datasets show that ResNet-50 embedded with INCAM reduces the Top-1 Error by 11.20% compared to the original ResNet-50, while the number of parameters increases by only 0.12%, and that INCAM is lighter and more efficient than other attention modules.
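The abstract only sketches the mechanism, so the following is a minimal PyTorch sketch of how a channel attention module of this kind might be built. It assumes the design mirrors other normalization-based attention modules: the learnable per-channel scaling variable gamma of an `InstanceNorm2d` layer serves as the variance measure and is converted into channel weights. The class name `INCAM` and every implementation detail below are assumptions for illustration; the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn


class INCAM(nn.Module):
    """Hypothetical sketch of an instance-normalization channel attention
    module. The per-channel scale gamma of InstanceNorm2d is the only
    added parameter set, which keeps the parameter overhead tiny."""

    def __init__(self, channels: int):
        super().__init__()
        # affine=True adds the learnable per-channel scale (gamma) and
        # shift (beta); gamma is what measures per-channel importance here.
        self.inorm = nn.InstanceNorm2d(channels, affine=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        normalized = self.inorm(x)
        gamma = self.inorm.weight                  # shape: (channels,)
        # Channels with larger |gamma| receive larger weights
        # (an assumed weighting scheme, not taken from the paper).
        weights = gamma.abs() / gamma.abs().sum()
        attended = normalized * weights.view(1, -1, 1, 1)
        # Sigmoid gating, then re-weighting of the original input.
        return x * torch.sigmoid(attended)
```

To embed such a module into ResNet-50 as the abstract describes, a common convention for channel attention is to apply it after the last convolution of each bottleneck block, before the residual addition (e.g. `out = self.incam(out)`); that placement is likewise an assumption here, not a detail taken from this record.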

Keywords: Attention module; Convolutional neural network; Image classification; Instance normalization; ResNet

Su Shuzhi (苏树智), Jiang Bowen (蒋博文), Chen Runbin (陈润斌)


School of Computer Science and Engineering, Anhui University of Science and Technology, Huainan 232001, Anhui, China


Funding: National Natural Science Foundation of China (61806006); China Postdoctoral Science Foundation (2019M660149)


Computer Simulation (计算机仿真)
The 17th Research Institute of China Aerospace Science and Industry Corporation

Indexed in: CSTPCD
Impact factor: 0.518
ISSN:1006-9348
Year, Volume (Issue): 2024, 41(1)