Single-parameter Channel Attention Module
姚亮亮¹, 张太红¹, 张洋宁¹, 温钊发¹
Author information
- 1. College of Computer and Information Engineering, Xinjiang Agricultural University, Urumqi 830052, Xinjiang, China; Engineering Research Center of Intelligent Agriculture, Ministry of Education, Urumqi 830052, Xinjiang; Xinjiang Agricultural Informatization Engineering Technology Research Center, Urumqi 830052, Xinjiang
Abstract
With the development of deep learning, channel attention has played a major role in improving the representation ability of convolutional neural networks. To further strengthen the role of channel attention modules in deep neural networks while keeping the parameter count low, a single-parameter channel attention (APA) module is proposed. First, the APA module applies a single learnable parameter to the summation vector of the image channel features. Then, the channel attention weights are obtained by measuring the directional relationship between each channel vector and the summation vector. Finally, the attention weights are passed through a Sigmoid activation function to make their distribution more stable. Compared with other channel attention modules, the proposed module carries only a negligible number of parameters, and its code implementation is very simple. On the CIFAR-10 and CIFAR-100 datasets, the APA module was embedded into MobileNet and the ResNet family of backbones and compared against the similar Squeeze-and-Excitation (SE) and Efficient Channel Attention (ECA) modules, verifying the effectiveness of the APA module.
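The three steps named in the abstract (single parameter on the channel summation vector, a directional measure between each channel vector and that summation vector, then Sigmoid activation) can be sketched as follows. This is a hedged reconstruction, not the authors' code: the function name `apa_attention` is invented, and the "directional relationship" is assumed here to be cosine similarity, which the abstract does not actually specify.

```python
import numpy as np

def apa_attention(x, alpha=1.0):
    """Sketch of a single-parameter channel attention (APA) step.

    x     : feature map of shape (C, H, W)
    alpha : the single learnable scalar (assumed scaling of the
            summation vector; a placeholder for the trained value)
    """
    c, h, w = x.shape
    v = x.reshape(c, -1)                  # one flattened vector per channel
    s = alpha * v.sum(axis=0)             # summation vector, scaled by alpha
    # Assumed directional measure: cosine similarity of each channel with s
    cos = (v @ s) / (np.linalg.norm(v, axis=1) * np.linalg.norm(s) + 1e-12)
    weights = 1.0 / (1.0 + np.exp(-cos))  # Sigmoid smooths the weight distribution
    return x * weights.reshape(c, 1, 1)   # reweight channels
```

Because the module holds only the scalar `alpha`, its parameter count is one, consistent with the abstract's claim of a negligible number of parameters compared with SE or ECA.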
Key words
convolutional neural network / deep learning / channel attention / image classification / amount of calculation
Funding
Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2022ZD0115805)
Major Science and Technology Special Project of Xinjiang Uygur Autonomous Region (2022A02011)
Publication year
2023