
Dependency-Based Relative Positional Encoding for Transformer NMT

2019-09-01 · RANLP 2019

Yutaro Omote, Akihiro Tamura, Takashi Ninomiya


Abstract

This paper proposes a new Transformer neural machine translation model that incorporates syntactic distances between source words into the relative position representations of the self-attention mechanism. Specifically, the proposed model encodes pairwise relative depths on a source dependency tree, that is, the differences between the depths of two source words, in the encoder's self-attention. Experiments show that the proposed model achieves a 0.5-point gain in BLEU on the Asian Scientific Paper Excerpt Corpus (ASPEC) Japanese-to-English translation task.
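The paper's code is not linked here, so the following is a minimal sketch of the idea, assuming a Shaw et al. (2018)-style relative-attention formulation in which the clipped depth difference between two source tokens indexes a learned embedding that is added to the attention logits. The names `DepAwareSelfAttention`, `max_rel_depth`, and `rel_key` are illustrative, not from the paper.

```python
# A minimal sketch (not the authors' implementation) of dependency-based
# relative positional encoding in encoder self-attention.
import torch
import torch.nn as nn
import torch.nn.functional as F


def relative_depths(depths: torch.Tensor) -> torch.Tensor:
    """Pairwise depth differences on the source dependency tree.

    depths: (batch, src_len) integer depth of each source token.
    Returns: (batch, src_len, src_len), entry (i, j) = depth[i] - depth[j].
    """
    return depths.unsqueeze(2) - depths.unsqueeze(1)


class DepAwareSelfAttention(nn.Module):
    """Self-attention whose relative positions are tree depths, not offsets."""

    def __init__(self, d_model: int, n_heads: int, max_rel_depth: int = 8):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        self.max_rel_depth = max_rel_depth
        # One learned key embedding per clipped relative depth in [-max, +max].
        self.rel_key = nn.Embedding(2 * max_rel_depth + 1, self.d_head)

    def forward(self, x: torch.Tensor, depths: torch.Tensor) -> torch.Tensor:
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        k = k.view(b, n, self.n_heads, self.d_head).transpose(1, 2)
        v = v.view(b, n, self.n_heads, self.d_head).transpose(1, 2)

        # Clip relative depths and look up embeddings: (b, n, n, d_head).
        rel = relative_depths(depths).clamp(-self.max_rel_depth,
                                            self.max_rel_depth)
        rel_emb = self.rel_key(rel + self.max_rel_depth)

        # Content-content term plus content-position term, as in Shaw et al.
        scores = torch.matmul(q, k.transpose(-2, -1))
        scores = scores + torch.einsum("bhid,bijd->bhij", q, rel_emb)
        attn = F.softmax(scores / self.d_head ** 0.5, dim=-1)
        ctx = torch.matmul(attn, v).transpose(1, 2).reshape(b, n, -1)
        return self.out(ctx)


# Usage sketch: depths would come from a dependency parser over the source.
# attn = DepAwareSelfAttention(d_model=512, n_heads=8)
# y = attn(torch.randn(2, 10, 512), torch.randint(0, 6, (2, 10)))
```

Clipping the relative depth, as in Shaw et al.'s clipped relative offsets, keeps the embedding table small and lets the model handle depth differences unseen during training.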
