
Image Inpainting Based on Perceptual Inference and External Spatial Prior Features

Deep-learning-based image inpainting has made remarkable progress. However, when the mask covers a large area, the lack of reliable prior guidance often leaves artifacts and blurred textures in the restored result. To address this, we propose an inpainting algorithm that combines prior features with image predictive filtering. The network consists of two branches: a filtering-kernel prediction branch and a feature-inference-and-filtering branch. Features extracted from the decoder of the kernel-prediction branch are reconstructed in the masked region by multi-scale external spatial feature fusion, then passed to the decoding stage of the other branch as prior features, supplying richer semantic information for inpainting. A spatial feature-aware inference block is then introduced into the feature-inference-and-filtering branch; it filters out distracting features while capturing informative long-range image context for reasoning. Finally, the predicted filtering kernels are applied to the image to remove artifacts. Comparative experiments against other inpainting networks on the CelebA and Places2 datasets demonstrate the superiority of the method in restoration quality.
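The abstract gives no implementation details, but the final step it describes, applying a predicted per-pixel filtering kernel to the image, can be illustrated with a minimal sketch. This is not the authors' code: the function name `predictive_filter`, the 3×3 kernel size, and the softmax normalization of kernel weights are all illustrative assumptions; in the paper the kernel logits would come from the kernel-prediction branch rather than being supplied directly.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along one axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def predictive_filter(image, kernel_logits, k=3):
    """Apply a per-pixel predicted kernel to a grayscale image.

    image         : (H, W) array
    kernel_logits : (H, W, k*k) raw scores for each pixel's kernel
                    (assumed output of a kernel-prediction network)
    """
    H, W = image.shape
    pad = k // 2
    padded = np.pad(image, pad, mode="reflect")
    # Softmax makes each pixel's k*k weights non-negative and sum to 1,
    # so filtering cannot change the overall brightness range.
    kernels = softmax(kernel_logits, axis=-1)
    # Gather the k*k neighborhood of every pixel: shape (H, W, k*k).
    patches = np.stack(
        [padded[i:i + H, j:j + W] for i in range(k) for j in range(k)],
        axis=-1,
    )
    # Weighted sum of each neighborhood = per-pixel adaptive filtering.
    return (patches * kernels).sum(axis=-1)
```

With all-zero logits the predicted kernel degenerates to a uniform 3×3 box filter, so a constant image passes through unchanged; a trained network would instead emit spatially varying logits that smooth artifacts while preserving edges.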

image inpainting; prior features; image predictive filtering; feature perception reasoning; external spatial feature fusion

吴鹏、张孙杰、王永雄、陈远峰、覃海旺


School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China


Supported by the Shanghai Chenguang Scholar Program (18CG52) and the Natural Science Foundation of Shanghai (22ZR1443700)

2024

Journal of Data Acquisition and Processing (数据采集与处理)
Sponsors: Chinese Institute of Electronics; Signal Processing Society, China Instrument and Control Society; Weak Signal Detection Society, Chinese Physical Society; Nanjing University of Aeronautics and Astronautics


Indexed in: CSTPCD; Peking University Core Journal List (北大核心)
Impact factor: 0.679
ISSN: 1004-9037
Year, volume (issue): 2024, 39(4)