
Simultaneous Localization and Mapping

Simultaneous localization and mapping (SLAM) is the task of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it.
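To make the definition concrete, here is a minimal, hedged sketch of the joint estimation at the heart of SLAM: a 1-D extended-Kalman-filter formulation in which a robot and a single landmark are estimated together from odometry and range measurements. All names, noise levels, and the simulated trajectory are illustrative assumptions, not drawn from any paper listed below.

```python
import numpy as np

def ekf_slam_1d(controls, measurements, motion_var=0.01, meas_var=0.04):
    """Minimal 1-D EKF-SLAM sketch: jointly estimate a robot position
    and one static landmark from odometry and range measurements."""
    # Joint state: [robot_x, landmark_x]; the landmark starts unknown,
    # which we encode as a very large initial variance.
    x = np.array([0.0, 0.0])
    P = np.diag([1e-4, 1e6])
    H = np.array([[-1.0, 1.0]])      # measurement model: z = landmark_x - robot_x
    for u, z in zip(controls, measurements):
        # Predict: motion affects only the robot; the landmark is static.
        x[0] += u
        P[0, 0] += motion_var
        # Update: fuse the range measurement into the joint state.
        y = z - (x[1] - x[0])        # innovation
        S = H @ P @ H.T + meas_var   # innovation covariance (1x1)
        K = P @ H.T / S              # Kalman gain (2x1)
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: the robot walks toward a landmark at x = 5.
rng = np.random.default_rng(0)
true_robot, true_landmark = 0.0, 5.0
controls, measurements = [], []
for _ in range(30):
    u = 0.1
    true_robot += u
    controls.append(u + rng.normal(0, 0.01))                       # noisy odometry
    measurements.append(true_landmark - true_robot + rng.normal(0, 0.2))

x_est, P_est = ekf_slam_1d(controls, measurements)
print(x_est)  # [robot estimate, landmark estimate]
```

The key property this sketch illustrates is the "simultaneous" part of the task: each measurement update changes both the map (the landmark estimate) and the localization (the robot estimate) through their joint covariance.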

(Image credit: ORB-SLAM2)

Papers

Showing 41–50 of 572 papers

Title | Status | Hype
Monocular visual simultaneous localization and mapping: (r)evolution from geometry to deep learning-based pipelines | Code | 1
Query Quantized Neural SLAM | Code | 1
NeRF and Gaussian Splatting SLAM in the Wild | Code | 1
ROVER: A Multi-Season Dataset for Visual SLAM | Code | 1
LCP-Fusion: A Neural Implicit SLAM with Enhanced Local Constraints and Computable Prior | Code | 1
LGU-SLAM: Learnable Gaussian Uncertainty Matching with Deformable Correlation Sampling for Deep Visual SLAM | Code | 1
NYC-Event-VPR: A Large-Scale High-Resolution Event-Based Visual Place Recognition Dataset in Dense Urban Environments | Code | 1
QueensCAMP: an RGB-D dataset for robust Visual SLAM | Code | 1
V3D-SLAM: Robust RGB-D SLAM in Dynamic Environments with 3D Semantic Geometry Voting | Code | 1
BodySLAM: A Generalized Monocular Visual SLAM Framework for Surgical Applications | Code | 1
Page 5 of 58

No leaderboard results yet.