SOTAVerified

Inverse Rendering

Inverse rendering is the task of recovering the properties of a scene, such as shape, material, and lighting, from an image or a video. Given an observation of a scene, the goal is to estimate these underlying properties and then use them to synthesize new images or videos, for example under novel viewpoints or lighting.
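At its core, inverse rendering is often posed as analysis by synthesis: fit scene parameters so that a forward rendering of them matches the observation. The following is a minimal illustrative sketch (not any published method): assuming known surface normals and a known directional light, it recovers per-pixel albedo under a Lambertian forward model by gradient descent.

```python
import numpy as np

# Toy analysis-by-synthesis sketch (illustrative assumption, not a published
# method): with known unit normals and a known directional light, recover
# per-pixel albedo by gradient descent on a Lambertian forward model.

rng = np.random.default_rng(0)

normals = rng.normal(size=(8, 8, 3))
normals /= np.linalg.norm(normals, axis=-1, keepdims=True)  # unit normals
light = np.array([0.0, 0.0, 1.0])                           # directional light

shading = np.clip(normals @ light, 0.0, None)               # max(0, n . l)
true_albedo = rng.uniform(0.2, 0.9, size=(8, 8))
observed = true_albedo * shading                            # the "captured" image

albedo = np.full((8, 8), 0.5)                               # initial guess
lr = 0.5
for _ in range(200):
    residual = albedo * shading - observed                  # render minus observation
    grad = residual * shading                               # d/d(albedo) of 0.5 * residual**2
    albedo -= lr * grad

# Albedo is only recoverable where the surface actually receives light.
lit = shading > 0.3
print(np.max(np.abs(albedo[lit] - true_albedo[lit])))       # small residual error
```

Real systems replace this closed-form toy with a differentiable renderer and optimize shape, material, and lighting jointly, which is what makes the problem ill-posed and interesting.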

Papers

Showing 81–90 of 271 papers

| Title | Status | Hype |
| --- | --- | --- |
| Channel-wise Noise Scheduled Diffusion for Inverse Rendering in Indoor Scenes | | 0 |
| GroomLight: Hybrid Inverse Rendering for Relightable Human Hair Appearance Modeling | | 0 |
| SuperCarver: Texture-Consistent 3D Geometry Super-Resolution for High-Fidelity Surface Detail Generation | | 0 |
| Vid2Avatar-Pro: Authentic Avatar from Videos in the Wild via Universal Prior | | 0 |
| GlossGau: Efficient Inverse Rendering for Glossy Surface with Anisotropic Spherical Gaussian | | 0 |
| Generative Detail Enhancement for Physically Based Materials | | 0 |
| OMG: Opacity Matters in Material Modeling with Gaussian Splatting | | 0 |
| NPSim: Nighttime Photorealistic Simulation From Daytime Images With Monocular Inverse Rendering and Ray Tracing | | 0 |
| Multi-view 3D surface reconstruction from SAR images by inverse rendering | | 0 |
| DiffusionRenderer: Neural Inverse and Forward Rendering with Video Diffusion Models | | 0 |
Page 9 of 28

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | Neural-PBIR | HDR-PSNR | 26.01 | | Unverified |
| 2 | NVDiffRecMC | HDR-PSNR | 24.43 | | Unverified |
| 3 | InvRender | HDR-PSNR | 23.76 | | Unverified |
| 4 | NeRFactor | HDR-PSNR | 23.54 | | Unverified |
| 5 | NeRD | HDR-PSNR | 23.29 | | Unverified |
| 6 | NVDiffRec | HDR-PSNR | 22.91 | | Unverified |
| 7 | PhySG | HDR-PSNR | 21.81 | | Unverified |
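The metric above is a PSNR variant evaluated on HDR renderings. The exact HDR-PSNR protocol (tone mapping, peak value, color space) is not specified on this page, so the sketch below shows plain PSNR with an explicitly chosen peak as an assumption, which is the common core of such metrics.

```python
import numpy as np

def psnr(pred, target, peak=1.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(peak^2 / MSE).

    `peak` is an assumption here; HDR evaluations must pick (or normalize to)
    a reference peak, since HDR pixel values are not bounded by 1.
    """
    mse = np.mean((pred - target) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

# Uniform per-pixel error of 0.1 gives MSE = 0.01, so PSNR = 20 dB at peak 1.
a = np.zeros((4, 4))
b = np.full((4, 4), 0.1)
print(psnr(a, b))  # → 20.0
```

Higher is better: each additional 10 dB corresponds to a 10x reduction in mean squared error, so the roughly 4 dB gap between the top and bottom entries is a substantial difference in reconstruction error.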