
Multi-Source Syntactic Neural Machine Translation

EMNLP 2018 · 2018-08-30

Anna Currey, Kenneth Heafield


Abstract

We introduce a novel multi-source technique for incorporating source syntax into neural machine translation using linearized parses. This is achieved by employing separate encoders for the sequential and parsed versions of the same source sentence; the resulting representations are then combined using a hierarchical attention mechanism. The proposed model improves over both seq2seq and parsed baselines by over 1 BLEU on the WMT17 English-German task. Further analysis shows that our multi-source syntactic model is able to translate successfully without any parsed input, unlike standard parsed methods. In addition, performance does not deteriorate as much on long sentences as for the baselines.
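The combination step described above — attending separately over the outputs of the sequential and parsed encoders, then attending over the two resulting context vectors — can be sketched as follows. This is a minimal illustration with dot-product attention and NumPy, not the authors' implementation; the function names and shapes are assumptions for clarity.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys):
    """Dot-product attention: weight the rows of `keys` (T x d) by
    their similarity to `query` (d,) and return the context vector."""
    scores = keys @ query          # (T,)
    weights = softmax(scores)      # (T,)
    return weights @ keys          # (d,)

def hierarchical_context(query, seq_enc, parse_enc):
    """Hierarchical attention over two source encoders (a sketch).

    First level: attend within each encoder's output states.
    Second level: attend over the two per-encoder context vectors,
    letting the decoder weight sequential vs. syntactic information.
    """
    c_seq = attend(query, seq_enc)      # context from the token encoder
    c_parse = attend(query, parse_enc)  # context from the linearized-parse encoder
    return attend(query, np.stack([c_seq, c_parse]))
```

Because the second-level attention can place all its weight on the sequential context, a model of this shape can in principle still translate when the parsed input is uninformative — consistent with the abstract's observation that the multi-source model copes without parsed input.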
