To improve the accuracy of deep neural networks by enhancing the attention mechanism, a re-parameterized channel attention module (RCAM) is presented. It is pointed out that the channel-compression strategy of squeeze-and-excitation networks has a significant impact on network accuracy, so this parameter-reduction method was abandoned. Instead, a channel re-parameterization module based on re-parameterization techniques was proposed and effectively combined with the attention mechanism. Following the integration strategy determined by ablation experiments, the attention module was inserted into the backbone network. Experimental results on the public datasets CIFAR-100 and ImageNet-100 show that, with the RepVGG_A0 backbone, accuracy improves by 2.37% and 1.72% respectively over the same network without an attention mechanism; with the ResNet-18 backbone, the corresponding improvements are 1.61% and 0.71%. Comparisons with other well-known attention mechanisms indicate that the proposed module is significantly better at enhancing the performance of the backbone network.
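The abstract does not give implementation details, so the following is only an illustrative NumPy sketch of the two ideas it names: SE-style channel attention with the channel-compression bottleneck removed (a full C-to-C excitation instead of C-to-C/r), and re-parameterization, here shown as two parallel linear channel branches fused into one equivalent branch for inference. All function names and shapes are assumptions, not the paper's actual module.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w, b):
    """SE-style channel attention WITHOUT channel compression.

    x: feature map, shape (C, H, W)
    w: full-rank channel transform, shape (C, C) -- no C/r bottleneck,
       reflecting the abstract's decision to drop squeeze-style compression
    b: bias, shape (C,)
    Returns the channel-re-weighted feature map, shape (C, H, W).
    """
    z = x.mean(axis=(1, 2))          # squeeze: global average pool -> (C,)
    s = sigmoid(w @ z + b)           # excite: single full-channel transform
    return x * s[:, None, None]      # scale: re-weight each channel

def fuse_branches(w1, b1, w2, b2):
    """Re-parameterization (illustrative): two parallel linear channel
    branches collapse into one equivalent branch for inference, since
    (w1 @ z + b1) + (w2 @ z + b2) == (w1 + w2) @ z + (b1 + b2)."""
    return w1 + w2, b1 + b2

rng = np.random.default_rng(0)
C, H, W = 8, 4, 4
x = rng.normal(size=(C, H, W))
w1, w2 = rng.normal(scale=0.1, size=(2, C, C))
b1, b2 = np.zeros(C), np.zeros(C)

# training-time view: two parallel branches feed the attention gate
z = x.mean(axis=(1, 2))
y_train = x * sigmoid(w1 @ z + b1 + w2 @ z + b2)[:, None, None]

# inference-time view: one fused branch, numerically identical
w_fused, b_fused = fuse_branches(w1, b1, w2, b2)
y_infer = channel_attention(x, w_fused, b_fused)
assert np.allclose(y_train, y_infer)
```

The fusion step only demonstrates why a multi-branch training structure can be collapsed without changing the output; the paper's actual re-parameterization and its placement in the backbone follow the ablation study described above.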