
Modeling with Recurrent Neural Networks for Open Vocabulary Slots

2018-08-01 · COLING 2018

Jun-Seong Kim, Junghoe Kim, SeungUn Park, Kwangyong Lee, Yoonju Lee


Abstract

Dealing with "open-vocabulary" slots has long been a challenge in natural language processing. While recent attention-based recurrent neural network (RNN) models have performed well on several language-related tasks, such as spoken language understanding and dialogue systems, few attempts have addressed filling slots whose values come from a virtually unlimited set. In this paper, we propose a new RNN model that captures a key insight: the perceived role of a word may vary according to how long a reader focuses on a particular part of a sentence. The proposed model combines a long-term aware attention structure, positional encoding that primarily considers the relative distance between words, and multi-task learning with a character-based language model and an intent detection model. We show that the model outperforms existing RNN models at discovering "open-vocabulary" slots without any external information, such as a named entity database or knowledge base. In particular, by evaluating the models on datasets from several domains, we confirm that it performs better as the number of such slots, including unknown words, increases. In addition, the proposed model also demonstrates superior performance on intent detection.
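The abstract's central idea, that attention should weigh both content similarity and the relative distance between words, can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the authors' exact formulation: the function name, the linear distance penalty, and the `decay` and `max_dist` parameters are all illustrative assumptions.

```python
import numpy as np

def relative_position_attention(hidden, query, max_dist=8, decay=0.1):
    """Toy attention over RNN hidden states with a relative-distance bias.

    Hypothetical sketch of the abstract's idea: attention weights depend
    not only on content similarity but also on how far each word is from
    the query position (nearer words are penalized less).
    """
    T, d = hidden.shape
    # Content-based scores: dot product of the query state with every hidden state.
    scores = hidden @ hidden[query]            # shape (T,)
    # Relative-distance bias: subtract a penalty that grows with distance,
    # clipped at max_dist so very distant words are penalized equally.
    dist = np.abs(np.arange(T) - query)
    scores = scores - decay * np.minimum(dist, max_dist)
    # Softmax over positions to obtain attention weights.
    e = np.exp(scores - scores.max())
    weights = e / e.sum()
    # Context vector: attention-weighted sum of hidden states.
    return weights, weights @ hidden

# Example: 5 time steps, 4-dimensional hidden states, attending from position 2.
h = np.random.randn(5, 4)
w, ctx = relative_position_attention(h, query=2)
```

In the actual model the distance-aware term would be learned jointly with the slot-filling, character-level language-model, and intent-detection objectives, rather than fixed as a linear penalty.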
