Abstract
Recently, deep-learning-based super-resolution algorithms have achieved satisfactory results. However, these methods generally consume large amounts of memory and have high computational complexity, making them difficult to deploy on low-compute or portable devices. To address this problem, this paper introduces a lightweight group-information distillation residual network (G-IDRN) for fast and accurate single-image super-resolution. Specifically, we propose a more effective group-information distillation block (G-IDB) as the basic block for feature extraction. We further introduce dense shortcut connections that combine multiple basic blocks into a group-information distillation residual group (G-IDRG), which captures multi-level information and effectively reuses the learned features. Moreover, a lightweight asymmetric residual Non-local block is proposed to model long-range dependencies and further improve super-resolution performance. Finally, a high-frequency loss function is designed to alleviate the smoothing of image details caused by the pixel-wise loss. Extensive experiments show that the proposed algorithm achieves a better trade-off between image super-resolution performance and model complexity than other state-of-the-art methods, reaching 56 FPS on the public B100 test set at a scale factor of 4, which is 15 times faster than the residual channel attention network.
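The high-frequency loss is only named in the abstract, not defined. As a rough illustration of the general idea, the following is a minimal, hypothetical PyTorch sketch that combines a standard pixel-wise L1 term with an L1 term on high-frequency components, where the high-frequency component is approximated as the residual between an image and its Gaussian-blurred copy. The class name HighFrequencyLoss, the Gaussian low-pass approximation, and the weight hf_weight are assumptions for illustration only and are not the paper's exact formulation.

    # Hypothetical sketch: pixel-wise L1 loss plus a high-frequency term.
    # The high-frequency part is approximated as image minus its Gaussian blur;
    # this is an assumption, not the paper's definition.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class HighFrequencyLoss(nn.Module):
        def __init__(self, hf_weight: float = 0.1, kernel_size: int = 5, sigma: float = 1.5):
            super().__init__()
            self.hf_weight = hf_weight
            self.pad = kernel_size // 2
            # Build a fixed depthwise Gaussian kernel used as a low-pass filter.
            coords = torch.arange(kernel_size, dtype=torch.float32) - (kernel_size - 1) / 2
            g = torch.exp(-coords ** 2 / (2 * sigma ** 2))
            g = g / g.sum()
            kernel_2d = torch.outer(g, g)
            self.register_buffer("kernel", kernel_2d.expand(3, 1, kernel_size, kernel_size).clone())

        def _high_freq(self, x: torch.Tensor) -> torch.Tensor:
            # High-frequency component = image minus its low-pass (blurred) version.
            blurred = F.conv2d(x, self.kernel, padding=self.pad, groups=3)
            return x - blurred

        def forward(self, sr: torch.Tensor, hr: torch.Tensor) -> torch.Tensor:
            pixel_loss = F.l1_loss(sr, hr)                                   # standard pixel-wise term
            hf_loss = F.l1_loss(self._high_freq(sr), self._high_freq(hr))    # detail-preserving term
            return pixel_loss + self.hf_weight * hf_loss

    # Usage example on random (N, C, H, W) tensors.
    if __name__ == "__main__":
        criterion = HighFrequencyLoss()
        sr = torch.rand(2, 3, 64, 64)
        hr = torch.rand(2, 3, 64, 64)
        print(criterion(sr, hr).item())

In this sketch the high-frequency term penalizes differences only in edges and textures, which is one plausible way to counteract the over-smoothing tendency of a pure pixel-wise loss mentioned in the abstract.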