
Incremental Decoding and Training Methods for Simultaneous Translation in Neural Machine Translation

2018-06-10 · NAACL 2018

Fahim Dalvi, Nadir Durrani, Hassan Sajjad, Stephan Vogel


Abstract

We address the problem of simultaneous translation by modifying the Neural MT decoder to operate with a dynamically built encoder and attention. We propose a tunable agent that decides the best segmentation strategy for a user-defined BLEU loss and Average Proportion (AP) constraint. Our agent outperforms the previously proposed Wait-if-diff and Wait-if-worse agents (Cho and Esipova, 2016) on BLEU with lower latency. Second, we propose data-driven changes to Neural MT training to better match the incremental decoding framework.
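The Average Proportion (AP) constraint mentioned above is the latency metric introduced by Cho and Esipova (2016): it averages, over all target tokens, the fraction of the source consumed when each token was emitted. A minimal sketch of the metric (the toy values below are illustrative, not from the paper):

```python
def average_proportion(g, src_len, tgt_len):
    """Average Proportion (AP) latency metric (Cho & Esipova, 2016).

    g[t] is the number of source tokens the decoder had read when it
    emitted target token t. AP = 1.0 means fully sequential translation
    (read the entire source before emitting anything); lower values
    mean the agent committed to output earlier, i.e. lower latency.
    """
    return sum(g) / (src_len * tgt_len)


# Toy example: a 4-token source and a 2-token target, where the agent
# emitted the first target token after reading 2 source tokens and the
# second after reading all 4.
print(average_proportion([2, 4], src_len=4, tgt_len=2))  # → 0.75
print(average_proportion([4, 4], src_len=4, tgt_len=2))  # → 1.0 (sequential)
```

A tunable agent of the kind described would trade this quantity off against BLEU: tightening the AP budget forces earlier commitments, typically at some cost in translation quality.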
