
Variational Disparity Estimation Framework for Plenoptic Image

2018-04-18

Trung-Hieu Tran, Zhe Wang, Sven Simon

Abstract

This paper presents a computational framework for accurately estimating the disparity map of plenoptic images. The framework is based on the variational principle and provides intrinsic sub-pixel precision. The light-field motion tensor introduced in the framework allows advanced robust data terms to be combined and provides explicit treatment of the different color channels. A warping strategy is embedded in the framework to tackle the large-displacement problem. We also show that applying a simple regularization term and guided median filtering greatly enhances the accuracy of the displacement field in occluded areas. We demonstrate the excellent performance of the proposed framework through extensive comparisons with the Lytro software and contemporary approaches on both synthetic and real-world datasets.
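Two of the building blocks mentioned in the abstract, the warping step used in variational schemes to handle large displacements and a median filtering pass over the disparity map, can be sketched in a few lines. This is a minimal numpy sketch under stated assumptions, not the authors' implementation: the function names are illustrative, the warp assumes a purely horizontal baseline between two sub-aperture views, and a plain 3x3 median stands in for the guided median filtering described in the paper.

```python
import numpy as np

def warp_horizontal(img, disp):
    """Warp a grayscale view along the horizontal baseline by the current
    disparity estimate, using linear interpolation (this interpolation is
    what gives the warp sub-pixel resolution)."""
    h, w = img.shape
    xs = np.arange(w)[None, :] + disp            # per-pixel sample positions
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
    a = np.clip(xs - x0, 0.0, 1.0)               # fractional (sub-pixel) part
    rows = np.arange(h)[:, None]
    return (1 - a) * img[rows, x0] + a * img[rows, x0 + 1]

def median_filter3(disp):
    """Plain 3x3 median filtering of the disparity map (a simple stand-in
    for guided median filtering), used to suppress outliers such as those
    arising near occlusions."""
    p = np.pad(disp, 1, mode='edge')
    h, w = disp.shape
    stack = [p[i:i + h, j:j + w] for i in range(3) for j in range(3)]
    return np.median(np.stack(stack), axis=0)
```

In a coarse-to-fine variational solver, `warp_horizontal` would be applied at the start of each warping iteration so the data term is linearized around the current disparity estimate, and the median filter would be applied to the disparity field between iterations.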
