Deep attentive style transfer for images with wavelet decomposition
Original source: NSTL / Elsevier
To address the problem of texture preservation during image style transfer, this paper presents a novel style transfer method for images that contain fine details easily noticed by human observers (e.g., human faces). We aim to achieve content-preserving style transfer via an appropriate trade-off between detail preservation and style transfer. To this end, we combine wavelet transformation with a deep neural network for decoupled style and detail synthesis. Additionally, style transfer should establish a one-to-one correspondence between the semantic structures of scenes and avoid noticeably unnatural style transitions around them. To address this issue, we leverage an attention mechanism and semantic segmentation for matching, and design a novel content loss with local one-to-one correspondence to produce content-preserving stylized results. Finally, we employ the wavelet transform to perform feature optimization (FO) to repair imperfect results. We conduct various experiments with the Qabf evaluation metric and a user study to validate the proposed method and show its superiority over state-of-the-art methods in overall stylization quality and texture preservation. (c) 2021 Elsevier Inc. All rights reserved.
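The abstract's key idea is that a wavelet decomposition separates an image into a low-frequency sub-band (coarse content, where style can be applied) and high-frequency sub-bands (edges and texture, which can be preserved or restored). As a minimal sketch of that decoupling, the following illustrates a one-level 2D Haar transform with exact reconstruction; the function names and the choice of the Haar basis are assumptions for illustration, not the paper's actual implementation, which couples wavelets with deep network features.

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2D Haar decomposition of an even-sized grayscale image.

    Returns (LL, LH, HL, HH): LL is the low-frequency approximation
    (where style transfer would mainly act), while LH/HL/HH carry the
    high-frequency detail (edges, texture) to be preserved.
    """
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row-wise low-pass
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row-wise high-pass
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, LH, HL, HH

def haar_idwt2(LL, LH, HL, HH):
    """Inverse transform: reconstructs the original image exactly."""
    h, w = LL.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2] = LL + LH; a[:, 1::2] = LL - LH
    d[:, 0::2] = HL + HH; d[:, 1::2] = HL - HH
    out = np.empty((2 * h, 2 * w))
    out[0::2, :] = a + d; out[1::2, :] = a - d
    return out
```

Because the transform is invertible, one can stylize only the LL band and recombine it with the untouched detail bands, which is the intuition behind detail-preserving synthesis described above.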
Keywords: Deep learning; Wavelet transform; Image style transfer; Photorealistic style; Artistic style; Multiscale; Removal