SOTAVerified

Inverse Rendering

Inverse rendering is the task of recovering the properties of a scene, such as shape, material, and lighting, from an image or a video. Once estimated, these properties can be used to generate new images or videos of the scene, for example under novel lighting or viewpoints.
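The idea can be illustrated with a toy example: assume a per-pixel Lambertian forward model where each observed pixel is albedo times a known shading term, and recover the albedo by gradient descent on the reconstruction error. This is a minimal sketch only; all names are illustrative, and real methods differentiate through a full renderer to recover geometry and lighting as well.

```python
import random

random.seed(0)
# Toy Lambertian forward model per pixel: observed = albedo * shading.
# Shading (the n.l term) is assumed known here; real methods estimate it too.
n_pixels = 64
shading = [random.uniform(0.1, 1.0) for _ in range(n_pixels)]
true_albedo = [random.uniform(0.2, 0.9) for _ in range(n_pixels)]
observed = [a * s for a, s in zip(true_albedo, shading)]  # the "photograph"

# Inverse rendering as gradient descent on the squared reconstruction error.
albedo = [0.5] * n_pixels  # initial guess
lr = 0.5
for _ in range(500):
    for i in range(n_pixels):
        residual = albedo[i] * shading[i] - observed[i]  # render vs. observe
        albedo[i] -= lr * 2 * residual * shading[i]      # analytic gradient step

err = max(abs(a - t) for a, t in zip(albedo, true_albedo))
print(err)  # recovered albedo is close to ground truth on lit pixels
```

The same optimize-to-match-the-observation loop underlies the methods listed below; they differ mainly in the scene representation (points, Gaussians, neural fields) and in how the renderer is made differentiable.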

Papers

Showing 111-120 of 271 papers

Title | Status | Hype
Differentiable Point-based Inverse Rendering | | 0
The Sky's the Limit: Re-lightable Outdoor Scenes via a Sky-pixel Constrained Illumination Prior and Outside-In Visibility | Code | 1
GS-IR: 3D Gaussian Splatting for Inverse Rendering | Code | 2
NeISF: Neural Incident Stokes Field for Geometry and Material Estimation | | 0
Intrinsic Image Decomposition via Ordinal Shading | Code | 2
Virtual Home Staging: Inverse Rendering and Editing an Indoor Panorama under Natural Illumination | Code | 0
NePF: Neural Photon Field for Single-Stage Inverse Rendering | | 0
Holistic Inverse Rendering of Complex Facade via Aerial 3D Scanning | | 0
RENI++ A Rotation-Equivariant, Scale-Invariant, Natural Illumination Prior | Code | 1
Single-Image 3D Human Digitization with Shape-Guided Diffusion | | 0
Page 12 of 28

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | Neural-PBIR | HDR-PSNR | 26.01 | | Unverified
2 | NVDiffRecMC | HDR-PSNR | 24.43 | | Unverified
3 | InvRender | HDR-PSNR | 23.76 | | Unverified
4 | NeRFactor | HDR-PSNR | 23.54 | | Unverified
5 | NeRD | HDR-PSNR | 23.29 | | Unverified
6 | NVDiffRec | HDR-PSNR | 22.91 | | Unverified
7 | PhySG | HDR-PSNR | 21.81 | | Unverified