
Neural Rendering

Given some representation of a 3D scene (point cloud, mesh, voxel grid, etc.), the task is to design an algorithm that produces photorealistic renderings of the scene from arbitrary viewpoints. The task is sometimes accompanied by image or scene appearance manipulation.
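The core of many neural rendering methods (NeRF and its variants) is volume rendering along camera rays: each sample on a ray contributes color weighted by its opacity and by the transmittance of everything in front of it. A minimal sketch of that compositing step is shown below; the function name and the toy inputs are illustrative, not taken from any particular codebase.

```python
import numpy as np

def composite_ray(densities, colors, deltas):
    """NeRF-style alpha compositing of samples along a single ray.

    densities: (N,) non-negative volume densities sigma_i
    colors:    (N, 3) RGB color predicted at each sample
    deltas:    (N,) distances between adjacent samples
    Returns the rendered RGB value for the ray.
    """
    # Opacity of each segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-densities * deltas)
    # Transmittance T_i: probability the ray reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas  # per-sample contribution; sums to <= 1
    return (weights[:, None] * colors).sum(axis=0)

# An opaque first sample fully determines the ray color:
ray_color = composite_ray(
    np.array([1e9, 1.0]),                      # huge density at sample 0
    np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]),
    np.ones(2),
)
```

In a full renderer, `densities` and `colors` come from a learned scene representation queried at the sample positions, and this weighting is also what makes the pipeline differentiable end to end.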

Papers

Showing 271-280 of 514 papers

Title | Status | Hype
SGCNeRF: Few-Shot Neural Rendering via Sparse Geometric Consistency Guidance | | 0
SGD: Street View Synthesis with Gaussian Splatting and Diffusion Prior | | 0
SGSST: Scaling Gaussian Splatting Style Transfer | | 0
Sharpening Your Density Fields: Spiking Neuron Aided Fast Geometry Learning | | 0
SNeS: Learning Probably Symmetric Neural Surfaces from Incomplete Data | | 0
Spacetime Surface Regularization for Neural Dynamic Scene Reconstruction | | 0
SparseFusion: Distilling View-conditioned Diffusion for 3D Reconstruction | | 0
Spatial Broadcast Decoder: A Simple Architecture for Disentangled Representations in VAEs | | 0
SplatArmor: Articulated Gaussian splatting for animatable humans from monocular RGB videos | | 0
SpNeRF: Memory Efficient Sparse Volumetric Neural Rendering Accelerator for Edge Devices | | 0
Page 28 of 52

No leaderboard results yet.