
Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding

2019-04-20 · Asian Chapter of the Association for Computational Linguistics

Guokan Shang, Antoine Jean-Pierre Tixier, Michalis Vazirgiannis, Jean-Pierre Lorré

Abstract

Abstractive community detection is an important spoken language understanding task whose goal is to group utterances in a conversation according to whether they can be jointly summarized by a common abstractive sentence. This paper provides a novel approach to this task. We first introduce a neural contextual utterance encoder featuring three types of self-attention mechanisms. We then train it using the siamese and triplet energy-based meta-architectures. Experiments on the AMI corpus show that our system outperforms multiple energy-based and non-energy-based state-of-the-art baselines. Code and data are publicly available.
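To make the training setup concrete, below is a minimal sketch of a triplet energy-based objective of the kind the abstract describes. This is an illustrative assumption, not the paper's implementation: the actual contextual utterance encoder is replaced by toy embedding vectors, Euclidean distance is assumed as the energy function, and the `energy` and `triplet_loss` helpers are hypothetical names.

```python
import numpy as np

def energy(u, v):
    # Assumed energy function: Euclidean distance between two
    # utterance embeddings (stand-ins for encoder outputs).
    return np.linalg.norm(u - v)

def triplet_loss(anchor, positive, negative, margin=1.0):
    # Hinge-style triplet objective: the energy between utterances
    # of the same abstractive community (anchor, positive) should be
    # lower, by at least `margin`, than the energy between utterances
    # of different communities (anchor, negative).
    return max(0.0, margin + energy(anchor, positive) - energy(anchor, negative))

# Toy embeddings in place of contextual utterance encoder outputs.
rng = np.random.default_rng(0)
anchor = rng.normal(size=8)
positive = anchor + 0.1 * rng.normal(size=8)   # same community: near the anchor
negative = rng.normal(size=8)                  # different community

loss = triplet_loss(anchor, positive, negative)
```

A well-separated triplet (positive close to the anchor, negative far away) drives the loss to zero, while a violated triplet yields a positive loss that a gradient step on the encoder would reduce.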
