
Learning an epipolar shift compensation for light field image super-resolution

Light field imaging has drawn broad attention since the advent of practical light field capturing systems, which facilitate a wide range of applications in computer vision. However, existing learning-based methods for improving the spatial resolution of light field images neglect the sub-pixel shifts widely exploited by super-resolution techniques, and thus fail to recover rich high-frequency information. To fully exploit this shift information, our method learns an epipolar shift compensation for light field image super-resolution that keeps the restored light field angularly coherent while enhancing spatial resolution. The proposed method first utilizes the rich surrounding views along several typical epipolar directions to explore inter-view correlations. We then perform feature-level registration to capture accurate sub-pixel shifts relative to the central view, implemented by a compensation module equipped with dynamic deformable convolution. Finally, the complementary information from different spatial directions is fused to provide high-frequency details for the target view. By taking each sub-aperture image in turn as the central view, our method can be applied to light field images of any angular resolution. Extensive experiments on both synthetic and real-scene datasets demonstrate the superiority of our method over the state of the art, both qualitatively and quantitatively. Moreover, the proposed method preserves the inherent epipolar structure of light field images well. Specifically, our LFESCN method outperforms the state-of-the-art method by about 0.7 dB (PSNR) on average.
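The core intuition behind the compensation step is that neighbouring sub-aperture views see the same scene displaced by sub-pixel disparities along the epipolar lines, and aligning them recovers detail the target view alone lacks. The following is a minimal NumPy sketch of that sub-pixel alignment idea only; it is not the paper's network. The learned offset field of the dynamic deformable convolution is replaced here by a known scalar shift, and all names (`bilinear_warp`, `scene`, `neighbour`) are illustrative.

```python
import numpy as np

def bilinear_warp(img, dy, dx):
    """Sample img at (y + dy, x + dx) with bilinear interpolation.
    In the paper's setting, a learned per-pixel offset field would
    supply (dy, dx); here they are scalar shifts for illustration."""
    h, w = img.shape
    ys = np.clip(np.arange(h) + dy, 0, h - 1)
    xs = np.clip(np.arange(w) + dx, 0, w - 1)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = (1 - wx) * img[np.ix_(y0, x0)] + wx * img[np.ix_(y0, x1)]
    bot = (1 - wx) * img[np.ix_(y1, x0)] + wx * img[np.ix_(y1, x1)]
    return (1 - wy) * top + wy * bot

# A smooth synthetic scene; the "neighbour view" is the scene shifted
# by a sub-pixel disparity of 0.3 px along the horizontal epipolar line.
yy, xx = np.mgrid[0:32, 0:32].astype(float)
scene = np.sin(0.4 * xx) + np.cos(0.3 * yy)
neighbour = bilinear_warp(scene, 0.0, 0.3)   # simulated disparity

# Compensation: warp the neighbour back by the estimated shift so its
# features register with the central (target) view before fusion.
aligned = bilinear_warp(neighbour, 0.0, -0.3)

# Interior crop avoids boundary clamping; alignment shrinks the residual.
err_before = np.abs(neighbour - scene)[2:-2, 2:-2].mean()
err_after = np.abs(aligned - scene)[2:-2, 2:-2].mean()
```

Once the views are registered this way, their complementary high-frequency content can be fused into the target view; the paper performs this registration on learned features rather than pixels.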

Keywords: Light field; Super-resolution; Multi-view fusion; Dynamic deformable convolution; Feature fusion; Resolution; Network

Wang, Xinya; Ma, Jiayi; Yi, Peng; Tian, Xin; Jiang, Junjun; Zhang, Xiao-Ping


Wuhan Univ

Harbin Inst Technol

Ryerson Univ

2022

Information Fusion


Indexed in: EI, SCI
ISSN: 1566-2535
Year, vol.: 2022, 79