
Sequence-to-Dependency Neural Machine Translation

2017-07-01 · ACL 2017

Shuangzhi Wu, Dong-dong Zhang, Nan Yang, Mu Li, Ming Zhou


Abstract

A typical Neural Machine Translation (NMT) model generates translations from left to right as a linear sequence, without explicitly considering the latent syntactic structure of the target sentence. Inspired by the success of using target-language syntactic knowledge to improve statistical machine translation, in this paper we propose a novel Sequence-to-Dependency Neural Machine Translation (SD-NMT) method, in which the target word sequence and its corresponding dependency structure are jointly constructed and modeled, and this structure is used as context to facilitate word generation. Experimental results show that the proposed method significantly outperforms state-of-the-art baselines on Chinese-English and Japanese-English translation tasks.
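The core idea of jointly building the word sequence and its dependency tree can be illustrated with a toy sketch. The snippet below is an assumption-laden simplification: SD-NMT's actual model is a neural decoder that *predicts* each action; here the actions are given, and we only show how interleaving word emission with arc-standard transition actions (SHIFT, LEFT-ARC, RIGHT-ARC) builds a dependency structure incrementally alongside the sequence, so the partial tree is available as context at every step. The function name and step format are hypothetical.

```python
def build_translation_with_deps(steps):
    """Interleave word generation with transition-based parsing.

    steps: list of (action, word) pairs, where word is None for arc
    actions. Returns (words, arcs); each arc is a (head_index,
    dependent_index) pair over the generated word sequence.
    """
    words, stack, arcs = [], [], []
    for action, word in steps:
        if action == "SHIFT":
            # Emit the next target word and push its index on the stack.
            words.append(word)
            stack.append(len(words) - 1)
        elif action == "LEFT-ARC":
            # Second-from-top becomes a dependent of the top element.
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif action == "RIGHT-ARC":
            # Top becomes a dependent of the second-from-top element.
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return words, arcs

# "the cat sleeps": "the" modifies "cat", "cat" is subject of "sleeps".
steps = [
    ("SHIFT", "the"),
    ("SHIFT", "cat"),
    ("LEFT-ARC", None),   # the <- cat
    ("SHIFT", "sleeps"),
    ("LEFT-ARC", None),   # cat <- sleeps
]
words, arcs = build_translation_with_deps(steps)
```

In the real model, each decoding step would score both the next word and the next parser action conditioned on the partial tree built so far, rather than consuming a fixed action list.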
