Towards Understanding the Relation between Gestures and Language

2022-10-01 · COLING 2022

Artem Abzaliev, Andrew Owens, Rada Mihalcea

Abstract

In this paper, we explore the relation between gestures and language. Using a multimodal dataset consisting of TED talks in which the language is aligned with the gestures made by the speakers, we adapt a semi-supervised multimodal model to learn gesture embeddings. We show that gestures are predictive of the native language of the speaker, and that gesture embeddings further improve language prediction results. In addition, gesture embeddings might contain some linguistic information, as we show by probing embeddings for psycholinguistic categories. Finally, we analyze the words that lead to the most expressive gestures and find that function words drive the expressiveness of gestures.