
A Hand Motion-guided Articulation and Segmentation Estimation

2020-05-07 · Code Available

Richard Sahala Hartanto, Ryoichi Ishikawa, Menandro Roxas, Takeshi Oishi


Abstract

In this paper, we present a method for simultaneous articulation model estimation and segmentation of an articulated object in RGB-D images using human hand motion. Our method uses the hand motion in the initial articulation model estimation, the ICP-based model parameter optimization, and the region selection of the target object. The hand motion gives an initial guess of the articulation model: a prismatic or revolute joint. The method estimates the joint parameters by aligning the RGB-D images under the constraint of the hand motion. Finally, the target regions are selected from the cluster regions that move symmetrically along with the articulation model. Our experimental results show the robustness of the proposed method for various objects.
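The initial-guess step described above decides between a prismatic and a revolute joint from the hand trajectory. As an illustrative sketch only (not the paper's exact formulation), one way to make that decision is to fit both a line and a circle to the tracked 3D hand positions and compare the residuals; the function name and the fitting choices below are assumptions:

```python
import numpy as np

def classify_joint(points):
    """Heuristic joint-type guess from a hand trajectory.

    points: (N, 3) array of 3D hand positions over time.
    Returns "prismatic" if a straight line explains the motion better,
    "revolute" if a circular arc does. Illustrative sketch only.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)

    # Line fit via PCA: residual = mean distance to the first principal axis.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    axis = vt[0]
    proj = np.outer(centered @ axis, axis)
    line_res = np.linalg.norm(centered - proj, axis=1).mean()

    # Circle fit: project onto the trajectory's principal plane, then use
    # the algebraic (Kasa) least-squares circle fit in that plane.
    plane = centered @ vt[:2].T                      # (N, 2) in-plane coords
    A = np.column_stack([2 * plane, np.ones(len(plane))])
    b = (plane ** 2).sum(axis=1)
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(max(c + cx ** 2 + cy ** 2, 0.0))
    circ_res = np.abs(np.linalg.norm(plane - [cx, cy], axis=1) - r).mean()

    return "prismatic" if line_res < circ_res else "revolute"
```

For example, a drawer pull traces a near-straight path and would be labeled prismatic, while a door swing traces an arc and would be labeled revolute; the paper additionally refines the joint parameters with ICP-based optimization rather than relying on this one-shot fit.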
