
Image super-resolution based on parallel back-projection

Residual feature extraction and fusion based on back-projection can effectively strengthen the feature extraction capability of a deep network and thereby improve image super-resolution reconstruction. Building on this idea, an improved image super-resolution deep network with a parallel back-projection strategy is proposed, which further boosts super-resolution performance by enhancing high-frequency features in different frequency bands in parallel. Specifically, after shallow feature extraction, the network passes through multiple levels of dual-path parallel back-projection feature-enhancement modules. Each module contains two paths that apply up- and down-sampling in opposite orders, so residual features in different frequency bands are obtained simultaneously. By fusing the multi-level residual features, the high-frequency features of the image are progressively enhanced. The network also introduces multi-scale feature extraction and a channel attention mechanism to improve feature representation and learning ability. Extensive experiments on several public datasets show that the proposed method effectively improves super-resolution performance while also reducing model complexity to some extent.
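To make the dual-path idea concrete, the following is a minimal PyTorch sketch of one feature-enhancement level, not the authors' exact implementation: it assumes strided convolution / transposed convolution for down- and up-sampling, squeeze-and-excitation style channel attention, and illustrative names such as DualPathBackProjection and ChannelAttention. One path up-samples then down-samples, the other does the reverse, and the two residuals against the input are fused and re-weighted before being added back.

```python
import torch
import torch.nn as nn


class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention (illustrative choice)."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, 1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, 1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        return x * self.fc(self.pool(x))


class DualPathBackProjection(nn.Module):
    """One level of a dual-path parallel back-projection block (sketch).

    Path A: up-sample then down-sample; Path B: down-sample then up-sample.
    The residuals of both paths against the input capture high-frequency
    detail in different frequency bands; they are fused, passed through
    channel attention, and added back to the input features.
    """
    def __init__(self, channels, scale=2):
        super().__init__()
        k, s, p = scale * 2, scale, scale // 2  # kernel/stride/padding for the given scale
        self.up_a = nn.ConvTranspose2d(channels, channels, k, s, p)
        self.down_a = nn.Conv2d(channels, channels, k, s, p)
        self.down_b = nn.Conv2d(channels, channels, k, s, p)
        self.up_b = nn.ConvTranspose2d(channels, channels, k, s, p)
        self.fuse = nn.Conv2d(channels * 2, channels, 1)
        self.ca = ChannelAttention(channels)

    def forward(self, x):
        res_a = self.down_a(self.up_a(x)) - x   # up -> down residual
        res_b = self.up_b(self.down_b(x)) - x   # down -> up residual
        out = self.fuse(torch.cat([res_a, res_b], dim=1))
        return self.ca(out) + x                 # enhanced features with skip connection


# Usage: enhance a 64-channel feature map from shallow feature extraction.
feat = torch.randn(1, 64, 48, 48)
block = DualPathBackProjection(channels=64, scale=2)
print(block(feat).shape)  # torch.Size([1, 64, 48, 48])
```

In the full network described by the abstract, several such levels would be stacked and their residual outputs fused, together with multi-scale feature extraction, before the final up-sampling and reconstruction stage.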

single image super-resolution; deep learning; dual-channel back-projection; attention mechanism; feature fusion

熊承义、李雪静、高志荣、孙清清、刘川鄂


College of Electronic Information Engineering, South-Central Minzu University, Wuhan 430074

Hubei Key Laboratory of Intelligent Wireless Communication, South-Central Minzu University, Wuhan 430074

College of Computer Science, South-Central Minzu University, Wuhan 430074

single image super-resolution; deep network; parallel back-projection; multi-scale features; attention mechanism

Supported by the Fund of the National Key Laboratory of Multispectral Information Processing Technology; the Key Teaching and Research Project of South-Central Minzu University; the Graduate Course Ideological and Political Demonstration Course Project of South-Central Minzu University; and the Fundamental Research Funds for the Central Universities

6142113210303; JYZD20020; YJS22039; CZY21013

2024

Journal of South-Central Minzu University (Natural Science Edition)
South-Central Minzu University

Impact factor: 0.536
ISSN:1672-4321
Year, volume (issue): 2024, 43(1)