
Alleviating the Inequality of Attention Heads for Neural Machine Translation

2020-09-21 · COLING 2022

Zewei Sun, Shu-Jian Huang, Xin-yu Dai, Jia-Jun Chen


Abstract

Recent studies show that the attention heads in the Transformer are not equally important. We relate this phenomenon to the imbalanced training of multi-head attention and the model's dependence on specific heads. To address this problem, we propose a simple masking method, HeadMask, in two specific variants. Experiments show that translation improvements are achieved on multiple language pairs. Subsequent empirical analyses also support our assumption and confirm the effectiveness of the method.
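The abstract does not spell out the mechanism, so the sketch below illustrates one plausible reading: during training, HeadMask zeroes out a subset of attention heads, either chosen at random ("Random") or the currently most important ones ("Impt"), forcing the remaining heads to carry the computation. This is a minimal sketch, not the authors' code; the function `head_mask`, the tensor shapes, and the per-head importance score are all assumptions.

```python
import torch

def head_mask(attn_output: torch.Tensor,
              importance: torch.Tensor,
              k: int,
              mode: str = "random") -> torch.Tensor:
    """Zero out k attention heads during training (hypothetical sketch).

    attn_output: (batch, n_heads, seq_len, d_head) per-head outputs
    importance:  (n_heads,) importance scores; only used for mode="impt"
    """
    n_heads = attn_output.size(1)
    if mode == "random":
        # HeadMask-Random (assumed reading): pick k heads uniformly at random.
        masked = torch.randperm(n_heads)[:k]
    else:
        # HeadMask-Impt (assumed reading): mask the k most important heads
        # so the less-trained heads receive more gradient signal.
        masked = torch.topk(importance, k).indices
    mask = torch.ones(n_heads, device=attn_output.device)
    mask[masked] = 0.0
    # Broadcast the per-head mask over batch, sequence, and feature dims.
    return attn_output * mask.view(1, -1, 1, 1)
```

As with dropout, such a mask would only be applied during training and disabled at inference time; the "-18" suffix in the model names below is kept verbatim from the source.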

Tasks

Machine Translation

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| IWSLT2015 Vietnamese-English | HeadMask (Random-18) | BLEU | 26.85 | — | Unverified |
| IWSLT2015 Vietnamese-English | HeadMask (Impt-18) | BLEU | 26.36 | — | Unverified |
| WMT2016 Romanian-English | HeadMask (Impt-18) | BLEU | 32.95 | — | Unverified |
| WMT2016 Romanian-English | HeadMask (Random-18) | BLEU | 32.85 | — | Unverified |
| WMT2017 Turkish-English | HeadMask (Random-18) | BLEU | 17.56 | — | Unverified |
| WMT2017 Turkish-English | HeadMask (Impt-18) | BLEU | 17.48 | — | Unverified |
