
SMILEE: Symmetric Multi-modal Interactions with Language-gesture Enabled (AI) Embodiment

NAACL 2018 · 2018-06-01

Sujeong Kim, David Salter, Luke DeLuccia, Kilho Son, Mohamed R. Amer, Amir Tamrakar


Abstract

We demonstrate an intelligent conversational agent system designed for advancing human-machine collaborative tasks. The agent is able to interpret a user's communicative intent from both their verbal utterances and non-verbal behaviors, such as gestures. The agent is also itself able to communicate with both natural language and gestures, through its embodiment as an avatar, thus facilitating natural, symmetric multi-modal interactions. We demonstrate two intelligent agents with specialized skills in the Blocks World as use-cases of our system.
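The symmetric interaction loop described above — interpreting a paired utterance and gesture into an intent, then replying with a paired utterance and gesture — can be sketched as follows. This is a hypothetical illustration, not the paper's implementation: the `Message`, `interpret`, and `respond` names, and the deictic-resolution rule, are assumptions made for the sketch.

```python
from dataclasses import dataclass

# Hypothetical sketch of symmetric multi-modal interaction: the agent both
# interprets and produces paired language + gesture messages. All names and
# rules here are illustrative, not taken from the SMILEE system.

@dataclass
class Message:
    utterance: str   # verbal channel
    gesture: str     # non-verbal channel, e.g. "point_at_block_3"

def interpret(msg: Message) -> str:
    """Fuse the two channels into a single communicative intent."""
    if "this" in msg.utterance and msg.gesture.startswith("point_at"):
        # Deictic resolution: the pointing gesture disambiguates the pronoun.
        target = msg.gesture.removeprefix("point_at_")
        return f"place:{target}"
    return "clarify"

def respond(intent: str) -> Message:
    """Produce a symmetric reply: speech plus an avatar gesture."""
    if intent.startswith("place:"):
        target = intent.split(":", 1)[1]
        return Message(f"Okay, placing {target}.", f"reach_for_{target}")
    return Message("Which block do you mean?", "shrug")
```

The key point the sketch captures is symmetry: both directions of the exchange carry a language channel and a gesture channel, so the gesture can resolve references ("this") that speech alone leaves ambiguous.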
