Bi-Directional Differentiable Input Reconstruction for Low-Resource Neural Machine Translation

2018-11-02 · NAACL 2019 · Code Available

Xing Niu, Weijia Xu, Marine Carpuat

Abstract

We aim to better exploit the limited amounts of parallel text available in low-resource settings by introducing a differentiable reconstruction loss for neural machine translation (NMT). This loss compares original inputs to reconstructed inputs, obtained by back-translating translation hypotheses into the input language. We leverage differentiable sampling and bi-directional NMT to train models end-to-end, without introducing additional parameters. This approach achieves small but consistent BLEU improvements on four language pairs in both translation directions, and outperforms an alternative differentiable reconstruction strategy based on hidden states.
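The abstract's key ingredient is differentiable sampling, which lets the reconstruction loss backpropagate through discrete translation hypotheses. A common realization of this idea is the Gumbel-softmax relaxation, which replaces a hard argmax over the vocabulary with a temperature-controlled soft sample. The sketch below is a minimal numpy illustration of that relaxation, not the authors' implementation; the logits, vocabulary size, and temperature are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def gumbel_softmax(logits, tau=0.5):
    """Draw a soft (relaxed) sample from a categorical distribution
    via the Gumbel-softmax trick. Unlike a hard argmax, the result is
    a differentiable function of the logits, so a reconstruction loss
    computed downstream can pass gradients back through the sample."""
    # Gumbel(0, 1) noise makes the perturbed argmax a true sample
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau  # lower tau -> closer to a hard one-hot
    y = np.exp(y - y.max(axis=-1, keepdims=True))
    return y / y.sum(axis=-1, keepdims=True)

# Hypothetical decoder logits over a 5-word target vocabulary
logits = np.array([2.0, 0.5, 0.1, -1.0, 0.3])
sample = gumbel_softmax(logits)
```

In a setup like the paper describes, such soft samples of the forward translation would be fed to the reverse direction of a bi-directional NMT model, whose output is then compared to the original input to form the reconstruction loss.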
