
TRAM: Global Trajectory and Motion of 3D Humans from in-the-wild Videos

2024-03-26

Yufu Wang, ZiYun Wang, Lingjie Liu, Kostas Daniilidis

Abstract

We propose TRAM, a two-stage method to reconstruct a human's global trajectory and motion from in-the-wild videos. TRAM robustifies SLAM to recover the camera motion in the presence of dynamic humans and uses the scene background to derive the motion scale. Using the recovered camera as a metric-scale reference frame, we introduce a video transformer model (VIMO) to regress the kinematic body motion of a human. By composing the two motions, we achieve accurate recovery of 3D humans in world space, reducing global motion errors by a large margin compared to prior work. Project page: https://yufu-wang.github.io/tram4d/
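The composition step described above amounts to transforming per-frame body joints from camera coordinates into a shared world frame using the recovered metric-scale camera trajectory. The sketch below illustrates this with NumPy; the function name, array layouts, and coordinate conventions are illustrative assumptions, not the paper's actual API.

```python
import numpy as np

def compose_global_motion(R_wc, t_wc, joints_cam):
    """Illustrative sketch: lift per-frame body joints into world space.

    R_wc:        (T, 3, 3) camera-to-world rotations per frame
    t_wc:        (T, 3)    camera-to-world translations (metric scale)
    joints_cam:  (T, J, 3) 3D joints in each frame's camera coordinates
    Returns:     (T, J, 3) joints in the shared world frame.
    """
    # Batched rotation: for each frame t and joint k,
    # joints_world[t, k] = R_wc[t] @ joints_cam[t, k] + t_wc[t]
    joints_world = np.einsum('tij,tkj->tki', R_wc, joints_cam)
    return joints_world + t_wc[:, None, :]
```

With identity rotations, the world-frame joints are simply the camera-frame joints shifted by each frame's camera translation, which is the intuition behind recovering global trajectory from camera motion plus local body motion.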

Benchmark Results

Dataset | Model | Metric                | Claimed | Verified | Status
EMDB    | TRAM  | Average MPJPE-PA (mm) | 45.7    |          | Unverified
