
Absolute Pose from One or Two Scaled and Oriented Features

2024-01-01 · CVPR 2024 · Code Available

Jonathan Ventura, Zuzana Kukelova, Torsten Sattler, Dániel Baráth


Abstract

Keypoints used for image matching often include an estimate of the feature scale and orientation. While recent work has demonstrated the advantages of using feature scales and orientations for relative pose estimation, relatively little work has considered their use for absolute pose estimation. We introduce minimal solutions for absolute pose from two oriented feature correspondences in the general case, or from one scaled and oriented correspondence given a known vertical direction. Nowadays, assuming a known direction is not particularly restrictive, as modern consumer devices such as smartphones or drones are equipped with Inertial Measurement Units (IMUs) that provide the gravity direction by default. Compared to traditional absolute pose methods requiring three point correspondences, our solvers need a smaller minimal sample, reducing the cost and complexity of robust estimation. Evaluations on large-scale public real-world datasets demonstrate the advantage of our methods for fast and accurate localization in challenging conditions. Code is available at https://github.com/danini/absolute-pose-from-oriented-and-scaled-features.
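The abstract's claim that a smaller minimal sample reduces the cost of robust estimation follows from the standard RANSAC iteration bound: the number of samples needed to draw at least one all-inlier minimal set with a given confidence grows with the sample size. The sketch below is not the paper's solver; it is the textbook formula, shown here only to quantify the gain of 1- and 2-correspondence samples over the classical 3-point (P3P) sample.

```python
import math

def ransac_iterations(inlier_ratio: float, sample_size: int,
                      confidence: float = 0.99) -> int:
    """Iterations needed so that, with probability `confidence`, RANSAC
    draws at least one minimal sample consisting only of inliers.

    Standard bound: N = ceil(log(1 - confidence) / log(1 - w^s)),
    where w is the inlier ratio and s the minimal sample size.
    """
    good_sample_prob = inlier_ratio ** sample_size
    return math.ceil(math.log(1.0 - confidence) /
                     math.log(1.0 - good_sample_prob))

# At a 50% inlier ratio, compare the three minimal sample sizes:
#   s = 3 (classical P3P), s = 2 (two oriented features),
#   s = 1 (one scaled+oriented feature with known gravity).
for s in (3, 2, 1):
    print(f"sample size {s}: {ransac_iterations(0.5, s)} iterations")
```

At a 0.5 inlier ratio and 99% confidence, the bound drops from 35 iterations for a 3-point sample to 17 for a 2-correspondence sample and 7 for a single correspondence, which is the source of the speedup the abstract refers to.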
