SOTAVerified

Neural Rendering

Given some representation of a 3D scene (point cloud, mesh, voxels, etc.), the task is to design an algorithm that produces photorealistic renderings of the scene from an arbitrary viewpoint. The task is sometimes accompanied by image or scene appearance manipulation.
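As an illustration of what "rendering from an arbitrary viewpoint" involves, the sketch below shows the volume-rendering compositing step used by NeRF-style methods: densities and colors sampled along a camera ray are alpha-composited front to back into a single pixel color. This is a minimal NumPy sketch of the standard quadrature formula, not any specific paper's implementation; the function name and array shapes are assumptions for illustration.

```python
import numpy as np

def composite_ray(sigmas, colors, deltas):
    """Alpha-composite samples along one camera ray (NeRF-style quadrature).

    sigmas: (N,) volume densities at the N samples along the ray
    colors: (N, 3) RGB color at each sample
    deltas: (N,) distance between consecutive samples
    Returns the composited RGB color and the accumulated opacity.
    """
    # Opacity contributed by each sample over its interval.
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: probability the ray reaches each sample unoccluded.
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    weights = trans * alphas
    return weights @ colors, weights.sum()
```

A fully opaque first sample dominates the output (the ray terminates there), while zero density everywhere yields a transparent pixel; a full renderer repeats this per ray for every pixel of the target view.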

Papers

Showing 81–90 of 514 papers

Title | Status | Hype
Mode-GS: Monocular Depth Guided Anchored 3D Gaussian Splatting for Robust Ground-View Scene Rendering | — | 0
GMT: Enhancing Generalizable Neural Rendering via Geometry-Driven Multi-Reference Texture Transfer | Code | 0
OPONeRF: One-Point-One NeRF for Robust Neural Rendering | Code | 0
G3R: Gradient Guided Generalizable Reconstruction | — | 0
UniCal: Unified Neural Sensor Calibration | — | 0
DeformStream: Deformation-based Adaptive Volumetric Video Streaming | — | 0
GSplatLoc: Grounding Keypoint Descriptors into 3D Gaussian Splatting for Improved Visual Localization | Code | 2
AIM 2024 Sparse Neural Rendering Challenge: Dataset and Benchmark | — | 0
FusionRF: High-Fidelity Satellite Neural Radiance Fields from Multispectral and Panchromatic Acquisitions | — | 0
AIM 2024 Sparse Neural Rendering Challenge: Methods and Results | — | 0
Page 9 of 52

No leaderboard results yet.