
Simultaneous Localization and Mapping

Simultaneous localization and mapping (SLAM) is the task of constructing or updating a map of an unknown environment while simultaneously keeping track of an agent's location within it.

(Image credit: ORB-SLAM2)
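The definition above can be made concrete with a toy example: the sketch below is a minimal 1-D EKF-SLAM loop (all names, noise parameters, and the scenario are illustrative, not taken from any listed paper). The state vector holds both the robot position and a landmark position, so each predict/correct step updates the map and the localization estimate simultaneously.

```python
# Toy 1-D EKF-SLAM sketch (illustrative only): state x = [robot, landmark].
# Each step the robot moves with a control input (prediction) and measures
# the signed distance to the landmark (correction), so pose and map are
# estimated jointly in one filter.
import numpy as np

def ekf_slam_1d(controls, measurements, motion_var=0.1, meas_var=0.05):
    x = np.array([0.0, 0.0])            # [robot position, landmark position]
    P = np.diag([0.0, 1e6])             # landmark starts fully unknown
    F = np.eye(2)                       # motion model: only the robot moves
    H = np.array([[-1.0, 1.0]])         # measurement: z = landmark - robot
    for u, z in zip(controls, measurements):
        # Predict: apply odometry, grow robot uncertainty by motion noise.
        x = x + np.array([u, 0.0])
        P = F @ P @ F.T + np.diag([motion_var, 0.0])
        # Correct: fuse the range measurement via the Kalman gain.
        y = z - (x[1] - x[0])           # innovation
        S = H @ P @ H.T + meas_var      # innovation covariance (1x1)
        K = (P @ H.T) / S               # Kalman gain (2x1)
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulated run: robot drives 1.0 per step toward a landmark at 5.0.
true_landmark = 5.0
rng = np.random.default_rng(0)
controls = [1.0] * 4
positions = np.cumsum(controls)
measurements = [true_landmark - p + rng.normal(0.0, 0.05) for p in positions]
x, P = ekf_slam_1d(controls, measurements)
print(x)   # robot estimate near 4.0, landmark estimate near 5.0
```

Real visual or LiDAR SLAM systems such as those listed below replace this scalar state with full 6-DoF poses and thousands of landmarks, but the joint estimate-while-mapping structure is the same.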

Papers

Showing 261–270 of 572 papers

Title | Status | Hype
Det-SLAM: A semantic visual SLAM for highly dynamic scenes using Detectron2 | Code | 0
Closing the Loop: Graph Networks to Unify Semantic Objects and Visual Features for Multi-object Scenes | - | 0
LMBAO: A Landmark Map for Bundle Adjustment Odometry in LiDAR SLAM | - | 0
OA-SLAM: Leveraging Objects for Camera Relocalization in Visual SLAM | - | 0
Semantic Visual Simultaneous Localization and Mapping: A Survey | - | 0
Optimizing SLAM Evaluation Footprint Through Dynamic Range Coverage Analysis of Datasets | - | 0
LF-VISLAM: A SLAM Framework for Large Field-of-View Cameras with Negative Imaging Plane on Mobile Agents | Code | 1
General Place Recognition Survey: Towards the Real-world Autonomy Age | Code | 1
R^3LIVE++: A Robust, Real-time, Radiance reconstruction package with a tightly-coupled LiDAR-Inertial-Visual state Estimator | Code | 4
Visual Odometry with Neuromorphic Resonator Networks | - | 0
Page 27 of 58

No leaderboard results yet.