Multi-scale fusion single-image super-resolution reconstruction based on attention mechanism
SHENG Yue 1, XIN Yuelan 1, WANG Qingqing 1, XIE Qiqi 1
Author information
- 1. College of Physics and Electronic Information Engineering, Qinghai Normal University, Xining 810001, China; State Key Laboratory of Tibetan Intelligent Information Processing and Application, Xining 810001, China
Abstract
To address inadequate feature extraction and the limited ability to reconstruct high-frequency details during information recovery in image super-resolution reconstruction algorithms, a multi-scale fusion image super-resolution reconstruction algorithm based on the attention mechanism (SRGAN-MCA) is proposed on the basis of SRGAN. First, a multi-scale dense residual attention module based on the coordinate attention mechanism is constructed to extract feature information at different scales, addressing inadequate feature extraction in the nonlinear mapping stage of super-resolution reconstruction. Second, spectral normalization is embedded in the network discriminator to constrain its Lipschitz constant and improve the stability of network training. Finally, a Charbonnier loss function is added to train and optimize SRGAN-MCA for higher-quality reconstruction. Experiments on the Set5, Set14, and BSD100 datasets show that, compared with SRGAN, the peak signal-to-noise ratio (PSNR) of the 2× and 4× upscaled reconstructions improves by 0.35 dB and 0.47 dB on average, and the structural similarity (SSIM) improves by 0.0054 and 0.016 on average.
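The abstract names its building blocks without giving configurations, so the sketches below are illustrative rather than the authors' implementations. First, a minimal PyTorch sketch of a coordinate attention block (Hou et al., CVPR 2021), the mechanism the multi-scale dense residual attention module is built on; the channel-reduction ratio and the omission of batch normalization are assumptions not taken from the paper:

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Coordinate attention: pool separately along height and width so the
    attention weights retain positional information in each direction."""
    def __init__(self, channels: int, reduction: int = 16):  # reduction is an assumed value
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))  # pool over width  -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))  # pool over height -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        xh = self.pool_h(x)                              # (B, C, H, 1)
        xw = self.pool_w(x).permute(0, 1, 3, 2)          # (B, C, W, 1)
        y = self.act(self.conv1(torch.cat([xh, xw], dim=2)))   # shared 1x1 conv on (H+W) axis
        yh, yw = torch.split(y, [h, w], dim=2)
        ah = torch.sigmoid(self.conv_h(yh))                     # (B, C, H, 1) height attention
        aw = torch.sigmoid(self.conv_w(yw.permute(0, 1, 3, 2)))  # (B, C, 1, W) width attention
        return x * ah * aw                               # reweight features by both directions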
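Spectral normalization, used here to constrain the discriminator's Lipschitz constant, can be applied per layer with PyTorch's built-in utility; the layer shapes below are hypothetical, as the paper's discriminator architecture is not given in the abstract:

```python
import torch.nn as nn
from torch.nn.utils import spectral_norm

# spectral_norm rescales each weight by its largest singular value at every
# forward pass, bounding the layer's Lipschitz constant and stabilizing GAN training.
disc_block = nn.Sequential(
    spectral_norm(nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1)),
    nn.LeakyReLU(0.2, inplace=True),
)
```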
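Finally, the Charbonnier loss is a smooth, differentiable approximation of the L1 loss; a minimal sketch, assuming the commonly used smoothing constant eps = 1e-3 (the paper's value is not stated in the abstract):

```python
import torch

def charbonnier_loss(sr: torch.Tensor, hr: torch.Tensor, eps: float = 1e-3) -> torch.Tensor:
    """Charbonnier loss: sqrt((sr - hr)^2 + eps^2), averaged over all elements."""
    return torch.sqrt((sr - hr) ** 2 + eps ** 2).mean()
```

Because it behaves like L1 away from zero, the Charbonnier loss penalizes large residuals less harshly than L2, which is commonly credited with producing sharper reconstructions than a plain MSE objective.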
Keywords
super-resolution reconstruction / generative adversarial networks / attention mechanism / multi-scale feature fusion
Funding
National Natural Science Foundation of China (61662062)
Natural Science Foundation of Qinghai Province, General Program (2022-ZJ-929)
Publication year
2024