
Improving Neural Machine Translation with Neural Syntactic Distance

2019-06-01 · NAACL 2019

Chunpeng Ma, Akihiro Tamura, Masao Utiyama, Eiichiro Sumita, Tiejun Zhao


Abstract

The explicit use of syntactic information has proved useful for neural machine translation (NMT). However, previous methods resort to either tree-structured neural networks or long linearized sequences, both of which are inefficient. Neural syntactic distance (NSD) enables us to represent a constituent tree with a sequence whose length is identical to the number of words in the sentence. NSD has been used for constituent parsing, but not for machine translation. We propose five strategies to improve NMT with NSD. Experiments show that it is not trivial to improve NMT with NSD; nevertheless, the proposed strategies improve the translation performance of the baseline model (+2.1 (En--Ja), +1.3 (Ja--En), +1.2 (En--Ch), and +1.0 (Ch--En) BLEU).
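To illustrate the underlying idea, here is a minimal, hypothetical sketch of syntactic distance as commonly defined in the constituent-parsing literature: for each pair of adjacent words, the distance is the height of their lowest common ancestor in a binarized tree. This is only an illustration of the general concept; the paper's exact NSD formulation (and how the sequence is aligned to the sentence length) may differ.

```python
def syntactic_distances(tree):
    """Recursively collect words and adjacent-word distances.

    tree: a binarized constituent tree as nested 2-tuples of strings,
          e.g. (("the", "cat"), ("sat", "down")).
    Returns (words, dists, height), where dists[i] is the height of the
    lowest common ancestor of words[i] and words[i+1].
    """
    if isinstance(tree, str):          # leaf: a single word
        return [tree], [], 0
    lw, ld, lh = syntactic_distances(tree[0])
    rw, rd, rh = syntactic_distances(tree[1])
    height = max(lh, rh) + 1
    # The boundary between the two subtrees is governed by this node,
    # so the distance at that boundary equals this node's height.
    return lw + rw, ld + [height] + rd, height


words, dists, _ = syntactic_distances((("the", "cat"), ("sat", "down")))
# words = ["the", "cat", "sat", "down"]; dists = [1, 2, 1]
```

Note that this standard definition yields one distance per pair of adjacent words (length n-1 for n words); a variant that pads or shifts the sequence to length n, as the abstract describes, is a straightforward adjustment.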
