
Differentiable Sampling with Flexible Reference Word Order for Neural Machine Translation

2019-04-04 · NAACL 2019

Weijia Xu, Xing Niu, Marine Carpuat


Abstract

Despite some empirical success at correcting exposure bias in machine translation, scheduled sampling algorithms suffer from a major drawback: they incorrectly assume that words in the reference translations and in sampled sequences are aligned at each time step. Our new differentiable sampling algorithm addresses this issue by optimizing the probability that the reference can be aligned with the sampled output, based on a soft alignment predicted by the model itself. As a result, the output distribution at each time step is evaluated with respect to the whole predicted sequence. Experiments on IWSLT translation tasks show that our approach improves BLEU compared to maximum likelihood and scheduled sampling baselines. In addition, our approach is simpler to train with no need for sampling schedule and yields models that achieve larger improvements with smaller beam sizes.
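The abstract's key idea is to score each reference word against an alignment-weighted mixture of the model's output distributions, rather than forcing a position-by-position match. The sketch below illustrates that computation in numpy; it is a simplified illustration, not the paper's algorithm: the alignment scores, shapes, and loss form here are assumptions, and a real implementation would compute the soft alignment from the model itself and backpropagate through it.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Hypothetical shapes: T_out decoding steps, T_ref reference words, V vocab size.
rng = np.random.default_rng(0)
T_out, T_ref, V = 5, 4, 10

# Stand-in for the model's output distribution at each decoding step.
out_dists = softmax(rng.normal(size=(T_out, V)))

# Stand-in for a soft alignment between reference positions and output steps
# (attention-style scores, normalized over output steps).
align = softmax(rng.normal(size=(T_ref, T_out)), axis=-1)

# Reference word indices (placeholder data).
ref_ids = rng.integers(0, V, size=T_ref)

# Probability of each reference word under the alignment-weighted mixture
# of output distributions: each reference word is evaluated against the
# whole predicted sequence, not a single fixed time step.
p_ref = (align @ out_dists)[np.arange(T_ref), ref_ids]

# Negative log-likelihood of the reference under the soft alignment.
loss = -np.log(p_ref).sum()
print(loss)
```

In a trainable version, `out_dists` and `align` would be differentiable functions of the model parameters, so minimizing this loss trains the model through its own sampled outputs.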
