Light Field View Synthesis via Aperture Disparity and Warping Confidence Map

2020-09-07

Nan Meng, Kai Li, Jianzhuang Liu, Edmund Y. Lam


Abstract

This paper presents a learning-based approach to synthesizing the view from an arbitrary camera position given a sparse set of images. A key challenge in this novel view synthesis arises during reconstruction, where views from different input images may be inconsistent due to obstructions in the light path. We overcome this by jointly modeling the epipolar property and occlusion in designing a convolutional neural network. We start by defining and computing the aperture disparity map, which approximates the parallax and measures the pixel-wise shift between two views. While this relates to free-space rendering and can fail near object boundaries, we further develop a warping confidence map to address pixel occlusion in these challenging regions. The proposed method is evaluated on diverse real-world and synthetic light field scenes, and it outperforms several state-of-the-art techniques.
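The two ingredients named in the abstract — warping a view by a per-pixel disparity map, and down-weighting pixels where warped views disagree (a proxy for occlusion) — can be sketched as follows. This is a minimal NumPy illustration under an assumed horizontal-shift warping model; the function names and the Gaussian form of the confidence are hypothetical and are not the paper's actual learned network:

```python
import numpy as np

def warp_by_disparity(src, disparity):
    """Shift each pixel of `src` horizontally by its per-pixel disparity
    (nearest-neighbor sampling, clamped at the image border).

    Illustrative only: the paper's aperture disparity map is computed by a
    CNN; this shows the geometric warp it approximates.
    """
    h, w = src.shape[:2]
    xs = np.arange(w)[None, :] - disparity            # source column per pixel
    xs = np.clip(np.round(xs).astype(int), 0, w - 1)  # clamp to valid range
    rows = np.arange(h)[:, None]
    return src[rows, xs]

def warping_confidence(warped_a, warped_b, sigma=0.1):
    """Confidence in [0, 1]: low where two warped views disagree,
    which typically happens near occlusion boundaries."""
    err = np.abs(warped_a - warped_b)
    return np.exp(-(err ** 2) / (2 * sigma ** 2))
```

A confidence-weighted blend, e.g. `(c_a * warped_a + c_b * warped_b) / (c_a + c_b)`, then suppresses the inconsistent pixels near object boundaries that a plain disparity warp gets wrong.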
