Journal of Minnan Normal University (Natural Science Edition), 2024, Vol. 37, Issue 3: 71-83. DOI: 10.12457/j.issn.2095-7122.2024.03.006

Progressive feature fusion attention mechanism network for image super-resolution reconstruction

曾伯儒¹ 胡思宇¹

Author information

  • 1. College of Computer Science, Minnan Normal University, Zhangzhou 363000, Fujian, China


Abstract

To address the issue that the network sizes of deep convolutional neural network (CNN)-based image super-resolution methods are too large to be deployed on resource-constrained devices, this paper proposes a lightweight progressive attention fusion network (PAFN). PAFN is composed of multiple cascaded attention information fusing blocks (AIFBs) and employs a feature cross-fusion strategy to extract useful feature information from different network layers. Each AIFB integrates a progressive attention module, a pixel attention module, and a channel attention module, enabling the network to extract significant feature information across multiple dimensions and improve the image super-resolution reconstruction quality. PAFN effectively utilizes features extracted from different network layers to enhance network performance and reduce the network size. Extensive experimental results show that PAFN effectively enhances image quality and outperforms other advanced comparison algorithms in both objective evaluation metrics and subjective visual effects, thereby validating the effectiveness of the PAFN method.
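The abstract names channel attention and pixel attention among the components of each AIFB. As a rough illustration of these two general mechanisms — not the paper's exact modules; the weight shapes, the reduction ratio, and the single-gate pixel-attention variant below are assumptions for the sketch — a minimal NumPy version might look like:

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


def channel_attention(feat, w1, b1, w2, b2):
    """Squeeze-and-excitation-style channel attention.

    feat: (C, H, W) feature map.
    w1/b1, w2/b2: two-layer bottleneck weights (C -> C/r -> C).
    Returns the feature map reweighted per channel.
    """
    squeezed = feat.mean(axis=(1, 2))              # global average pool -> (C,)
    hidden = np.maximum(0.0, w1 @ squeezed + b1)   # ReLU bottleneck -> (C/r,)
    gates = sigmoid(w2 @ hidden + b2)              # per-channel gates in (0, 1)
    return feat * gates[:, None, None]


def pixel_attention(feat, w, b):
    """Pixel attention: a 1x1 convolution producing one gate per spatial location.

    feat: (C, H, W) feature map; w: (C,) 1x1-conv weights; b: scalar bias.
    Returns the feature map reweighted per pixel.
    """
    gate = sigmoid(np.tensordot(w, feat, axes=([0], [0])) + b)  # (H, W)
    return feat * gate[None, :, :]


# Example: gate an 8-channel 4x4 feature map with both mechanisms.
rng = np.random.default_rng(0)
C, H, W, r = 8, 4, 4, 2
feat = rng.standard_normal((C, H, W))
w1, b1 = rng.standard_normal((C // r, C)), np.zeros(C // r)
w2, b2 = rng.standard_normal((C, C // r)), np.zeros(C)
ca_out = channel_attention(feat, w1, b1, w2, b2)
pa_out = pixel_attention(feat, rng.standard_normal(C), 0.0)
```

Both mechanisms produce multiplicative gates in (0, 1), so the output keeps the input's shape; a lightweight SR block would typically combine such gated branches with small convolutions and skip connections.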


Key words

image super-resolution reconstruction / large-kernel convolution / feature fusion / attention mechanism


Publication year: 2024
Journal: Journal of Minnan Normal University (Natural Science Edition)
Publisher: Zhangzhou Normal College (漳州师范学院)
Impact factor: 0.272
ISSN: 1008-7826