SOTAVerified

SAM2MOT: A Novel Paradigm of Multi-Object Tracking by Segmentation

2025-04-06 · Code Available

Junjie Jiang, Zelin Wang, Manqi Zhao, Yin Li, Dongsheng Jiang


Abstract

Segment Anything 2 (SAM2) enables robust single-object tracking using segmentation. To extend this to multi-object tracking (MOT), we propose SAM2MOT, introducing a novel Tracking by Segmentation paradigm. Unlike Tracking by Detection or Tracking by Query, SAM2MOT directly generates tracking boxes from segmentation masks, reducing reliance on detection accuracy. SAM2MOT has two key advantages: zero-shot generalization, allowing it to work across datasets without fine-tuning, and strong object association, inherited from SAM2. To further improve performance, we integrate a trajectory manager system for precise object addition and removal, and a cross-object interaction module to handle occlusions. Experiments on DanceTrack, UAVDT, and BDD100K show state-of-the-art results. Notably, SAM2MOT outperforms existing methods on DanceTrack by +2.1 HOTA and +4.5 IDF1, highlighting its effectiveness in MOT. Code is available at https://github.com/TripleJoy/SAM2MOT.
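The abstract's core idea, generating tracking boxes directly from segmentation masks rather than from a detector, can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the mask representation (a set of `(x, y)` pixel coordinates) and the function names are assumptions made here for clarity.

```python
# Hedged sketch of the Tracking-by-Segmentation idea: each tracked object's
# box is derived directly from its segmentation mask, so box quality depends
# on the segmenter (SAM2) rather than on a separate detector.
# Mask format and function names are illustrative assumptions.

def mask_to_box(mask_pixels):
    """Tight axis-aligned bounding box (x_min, y_min, x_max, y_max) of a mask,
    given the mask as a set of (x, y) foreground pixel coordinates."""
    xs = [x for x, _ in mask_pixels]
    ys = [y for _, y in mask_pixels]
    return (min(xs), min(ys), max(xs), max(ys))

def boxes_from_masks(masks_by_id):
    """One MOT output frame: maps each object id to a tracking box computed
    straight from its mask; objects with empty masks produce no box."""
    return {obj_id: mask_to_box(m) for obj_id, m in masks_by_id.items() if m}
```

In the full system described above, SAM2 would supply the per-object masks frame by frame, while the trajectory manager decides which object ids to add or remove before boxes are emitted.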

Benchmark Results

| Dataset    | Model   | Metric | Claimed | Verified | Status     |
|------------|---------|--------|---------|----------|------------|
| DanceTrack | SAM2MOT | HOTA   | 75.9    |          | Unverified |
| UAVDT      | SAM2MOT | IDF1   | 74.4    |          | Unverified |
