Synchronous Syntactic Attention for Transformer Neural Machine Translation
ACL 2021 · 2021-08-01
Hiroyuki Deguchi, Akihiro Tamura, Takashi Ninomiya
Abstract
This paper proposes a novel attention mechanism for Transformer Neural Machine Translation, "Synchronous Syntactic Attention," inspired by synchronous dependency grammars. The mechanism synchronizes source-side and target-side syntactic self-attentions by minimizing the difference between the target-side self-attentions and the source-side self-attentions mapped through the encoder-decoder attention matrix. Experiments show that the proposed method improves translation performance on WMT14 En-De, WMT16 En-Ro, and ASPEC Ja-En (by up to +0.38 BLEU).
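The core idea can be illustrated with a small sketch. Assuming the synchronization is realized as a penalty on the discrepancy between the target self-attention matrix and the source self-attention matrix projected into target space via the cross-attention matrix, a minimal NumPy version might look as follows (the exact formulation in the paper may differ; the function name and the squared-Frobenius penalty are illustrative assumptions):

```python
import numpy as np

def sync_attention_loss(src_attn, tgt_attn, cross_attn):
    """Illustrative synchronization penalty (assumed formulation).

    src_attn:   (n_src, n_src) source-side self-attention matrix
    tgt_attn:   (n_tgt, n_tgt) target-side self-attention matrix
    cross_attn: (n_tgt, n_src) encoder-decoder attention matrix

    Maps the source self-attention into the target space through the
    cross-attention matrix, then penalizes its squared Frobenius
    distance from the target self-attention.
    """
    # Project source attention into target positions: (n_tgt, n_tgt)
    mapped = cross_attn @ src_attn @ cross_attn.T
    return float(np.sum((tgt_attn - mapped) ** 2))

# Toy example: if the cross-attention is the identity and the two
# self-attentions already agree, the penalty is zero.
n = 4
attn = np.full((n, n), 1.0 / n)       # uniform self-attention
identity_cross = np.eye(n)
print(sync_attention_loss(attn, attn, identity_cross))  # → 0.0
```

Minimizing such a term alongside the usual translation loss would push the decoder's self-attention toward a structure consistent with the encoder's, which matches the abstract's description of synchronizing syntactic attentions.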