
Neural Machine Translation with Reordering Embeddings

2019-07-01 · ACL 2019

Kehai Chen, Rui Wang, Masao Utiyama, Eiichiro Sumita


Abstract

The reordering model plays an important role in phrase-based statistical machine translation. However, few works exploit reordering information in neural machine translation. In this paper, we propose a reordering mechanism that learns the reordering embedding of a word based on its contextual information. These learned reordering embeddings are stacked together with self-attention networks to learn sentence representations for machine translation. The reordering mechanism can be easily integrated into both the encoder and the decoder of the Transformer translation system. Experimental results on the WMT'14 English-to-German, NIST Chinese-to-English, and WAT Japanese-to-English translation tasks demonstrate that the proposed methods significantly improve the performance of the Transformer.
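The core idea described in the abstract — predict a soft, context-dependent position for each word, encode it as a "reordering embedding", and stack it with self-attention — can be illustrated with a minimal NumPy sketch. This is not the paper's exact formulation: the predictor `W_r`, the projections `W_q`/`W_k`/`W_v`, and the mean-pooled context summary are hypothetical stand-ins for learned components.

```python
import numpy as np

# Minimal sketch (not the paper's exact formulation): a word's reordering
# embedding is a positional encoding evaluated at a soft position predicted
# from context, added to the word embedding before self-attention.
# W_r, W_q, W_k, W_v are hypothetical stand-ins for learned parameters.

rng = np.random.default_rng(0)
n, d = 5, 8                              # sentence length, embedding size

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def sinusoidal(pos, d):
    # Transformer-style positional encoding at (possibly fractional) positions
    i = np.arange(d // 2)
    ang = pos[:, None] / (10000.0 ** (2 * i / d))
    return np.concatenate([np.sin(ang), np.cos(ang)], axis=-1)

X = rng.normal(size=(n, d))              # word embeddings
ctx = X.mean(axis=0, keepdims=True)      # crude contextual summary

# Predict a soft reordered position in [0, n-1] for each word
W_r = rng.normal(size=(d, 1)) / np.sqrt(d)
soft_pos = (n - 1) / (1.0 + np.exp(-((X + ctx) @ W_r).ravel()))

H = X + sinusoidal(soft_pos, d)          # stack reordering embedding onto words

# One scaled dot-product self-attention layer over the augmented representation
W_q, W_k, W_v = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
Q, K, V = H @ W_q, H @ W_k, H @ W_v
A = softmax(Q @ K.T / np.sqrt(d), axis=-1)   # attention weights, rows sum to 1
H_out = A @ V                                # sentence representation

print(H_out.shape)                       # (5, 8)
```

Because the predicted positions are continuous, the whole pipeline stays differentiable, which is what allows a reordering signal of this kind to be trained end-to-end inside either the encoder or the decoder.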
