A Neural Attention Model for Disfluency Detection

2016-12-01 · COLING 2016

Shaolei Wang, Wanxiang Che, Ting Liu

Abstract

In this paper, we study the problem of disfluency detection using the encoder-decoder framework. We treat disfluency detection as a sequence-to-sequence problem and propose a neural attention-based model which can efficiently model long-range dependencies between words and makes the resulting sentence more likely to be grammatically correct. Our model first encodes the source sentence with a bidirectional Long Short-Term Memory (BI-LSTM) and then uses neural attention as a pointer to select an ordered subsequence of the input as the output. Experiments show that our model achieves a state-of-the-art f-score of 86.7% on the commonly used English Switchboard test set. We also evaluate the performance of our model on in-house annotated Chinese data and achieve a significantly higher f-score compared to a CRF-based baseline.
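The core idea of the abstract, attention used as a pointer that selects an ordered subsequence of the input (the fluent words), can be sketched in a few lines. The following is a toy NumPy illustration, not the paper's implementation: the encoder states, weight matrix, and decoder-state update are placeholder assumptions, and the ordering constraint is enforced with a simple mask so each pointer must move strictly rightward.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def select_ordered_subsequence(enc, max_steps, W):
    """Attention-as-pointer sketch (hypothetical, not the paper's code).

    enc:        (n, d) array standing in for BI-LSTM encoder states
    max_steps:  maximum number of decode steps
    W:          (d, d) toy attention weight matrix

    At each decode step we attend over all encoder states, mask out
    positions at or before the last pointer (so the output indices are
    strictly increasing, i.e. an ordered subsequence), and greedily
    pick the highest-scoring position.
    """
    selected = []
    last = -1
    state = enc.mean(axis=0)  # toy initial decoder state
    for _ in range(max_steps):
        scores = enc @ (W @ state)     # attention logits, shape (n,)
        mask = np.full(len(enc), -1e9) # forbid already-passed positions
        mask[last + 1:] = 0.0
        probs = softmax(scores + mask)
        ptr = int(probs.argmax())
        if ptr <= last:                # nothing left to the right
            break
        selected.append(ptr)
        last = ptr
        state = enc[ptr]               # feed the chosen word back in
    return selected

rng = np.random.default_rng(0)
enc = rng.standard_normal((6, 4))      # 6 toy "words", dim 4
W = rng.standard_normal((4, 4))
kept = select_ordered_subsequence(enc, 6, W)
print(kept)                            # strictly increasing indices
```

In the actual model the encoder states come from a trained BI-LSTM and the pointer distribution is learned; the sketch only shows why masking past positions guarantees the output is an ordered subsequence of the input, which is what keeps the cleaned sentence grammatical.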
