
Multi-Head Attention with Disagreement Regularization

2018-10-24 · EMNLP 2018

Jian Li, Zhaopeng Tu, Baosong Yang, Michael R. Lyu, Tong Zhang


Abstract

Multi-head attention is appealing for its ability to jointly attend to information from different representation subspaces at different positions. In this work, we introduce a disagreement regularization to explicitly encourage diversity among multiple attention heads. Specifically, we propose three types of disagreement regularization, which respectively encourage the subspaces, the attended positions, and the output representations associated with different attention heads to differ from one another. Experimental results on the widely used WMT14 English-German and WMT17 Chinese-English translation tasks demonstrate the effectiveness and universality of the proposed approach.
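The core idea can be illustrated with a small sketch. The hedged assumption here, based only on the abstract, is that each disagreement regularizer is an averaged negative cosine similarity among per-head quantities (value subspaces, attention distributions, or output representations), so that maximizing it pushes heads apart; the function name `disagreement` and the exact averaging scheme are illustrative, not taken from the paper.

```python
import numpy as np

def disagreement(heads):
    """Negative mean pairwise cosine similarity across attention heads.

    heads: array of shape (H, d) -- one flattened vector per head
    (a value subspace, an attention distribution, or an output
    representation, depending on which variant is regularized).
    Larger values indicate more diverse heads, so adding this term
    to the training objective rewards disagreement.
    Note: the mean here includes self-pairs (i == j); whether the
    paper excludes them is an assumption left unverified.
    """
    normed = heads / np.linalg.norm(heads, axis=1, keepdims=True)
    sim = normed @ normed.T  # (H, H) matrix of cosine similarities
    return -sim.mean()

rng = np.random.default_rng(0)
identical = np.tile(rng.normal(size=(1, 8)), (4, 1))  # 4 identical heads
diverse = rng.normal(size=(4, 8))                     # 4 random heads

# Identical heads are maximally "agreeing", so their score is lower.
assert disagreement(identical) < disagreement(diverse)
```

In training, such a term would be added (with a weight) to the translation likelihood, so the model trades a little likelihood for more diverse heads.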
