
Gesture Recognition in RGB Videos Using Human Body Keypoints and Dynamic Time Warping

2019-06-25

Pascal Schneider, Raphael Memmesheimer, Ivanna Kramer, Dietrich Paulus


Abstract

Gesture recognition opens up new ways for humans to interact intuitively with machines. Especially for service robots, gestures can be a valuable addition to the means of communication, for example to draw the robot's attention to someone or something. Extracting a gesture from video data and classifying it is a challenging task, and a variety of approaches have been proposed throughout the years. This paper presents a method for gesture recognition in RGB videos using OpenPose to extract the pose of a person and Dynamic Time Warping (DTW) in conjunction with One-Nearest-Neighbor (1NN) for time-series classification. The main features of this approach are its independence from any specific hardware and its high flexibility, because new gestures can be added to the classifier with only a few examples of each. We utilize the robustness of the Deep Learning-based OpenPose framework while avoiding the data-intensive task of training a neural network ourselves. We demonstrate the classification performance of our method using a public dataset.
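The DTW + 1NN classification step described in the abstract can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the function names and the representation of each frame as a flattened keypoint coordinate vector are assumptions. The idea is that DTW aligns two gesture sequences of different lengths and speeds, and 1NN assigns the query the label of the template with the smallest DTW distance.

```python
# Illustrative sketch (assumed, not the paper's code): DTW distance between
# sequences of per-frame pose feature vectors (e.g. flattened (x, y) keypoint
# coordinates as produced by OpenPose), plus a One-Nearest-Neighbor rule.
import math


def dtw_distance(a, b):
    """DTW distance between two sequences of equal-dimension feature vectors."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = minimal accumulated cost of aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # Euclidean frame-to-frame cost
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def classify_1nn(query, templates):
    """Label of the (label, sequence) template with the smallest DTW distance."""
    return min(templates, key=lambda lt: dtw_distance(query, lt[1]))[0]


# Usage with toy 2-D "keypoint" trajectories for two hypothetical gestures:
templates = [
    ("wave",  [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0]]),
    ("point", [[0.0, 0.0], [0.0, 1.0], [0.0, 2.0]]),
]
print(classify_1nn([[0.0, 0.0], [1.0, 0.9], [2.0, 2.1]], templates))
```

Because the classifier only compares a query against stored templates, adding a new gesture class amounts to appending a few labeled example sequences to `templates`, which is the flexibility the paper highlights.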
