Monolingual Adapters for Zero-Shot Neural Machine Translation

2020-11-01 · EMNLP 2020

Jerin Philip, Alexandre Berard, Matthias Gallé, Laurent Besacier

Abstract

We propose a novel adapter layer formalism for adapting multilingual models. These layers are more parameter-efficient than existing adapter layers while achieving comparable or better performance. Each layer is specific to one language (as opposed to bilingual adapters), allowing them to be composed and generalized to unseen language pairs. In this zero-shot setting, they obtain a median improvement of +2.77 BLEU points over a strong 20-language multilingual Transformer baseline trained on TED talks.
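The abstract does not spell out the adapter architecture, but adapter layers in the literature are typically small bottleneck networks with a residual connection, inserted into a frozen pretrained model. The sketch below illustrates that general idea and the zero-shot composition step (pairing a source-language adapter with a target-language adapter); the class name, dimensions, and initialization are illustrative assumptions, not the authors' exact formalism.

```python
import numpy as np

class Adapter:
    """Hypothetical sketch of a per-language adapter: a small bottleneck
    MLP with a residual connection. Dimensions and init are illustrative."""

    def __init__(self, d_model: int, bottleneck: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        # Down- and up-projection weights (small random init).
        self.w_down = rng.normal(0.0, 0.02, size=(d_model, bottleneck))
        self.w_up = rng.normal(0.0, 0.02, size=(bottleneck, d_model))

    def __call__(self, x: np.ndarray) -> np.ndarray:
        # Bottleneck transform with ReLU, then a residual connection,
        # so an untrained adapter stays close to the identity map.
        h = np.maximum(x @ self.w_down, 0.0)
        return x + h @ self.w_up

# Zero-shot composition idea: keep one adapter per language and pair the
# source-language adapter with the target-language adapter at test time,
# even for a language pair never seen together in training.
adapters = {"en": Adapter(512, 64, seed=1), "de": Adapter(512, 64, seed=2)}
x = np.ones((4, 512))                   # dummy hidden states
y = adapters["de"](adapters["en"](x))   # compose per-language adapters
print(y.shape)  # (4, 512)
```

Because each adapter is tied to a single language rather than a language pair, the number of adapter parameters grows linearly in the number of languages instead of quadratically in the number of pairs, which is the source of the parameter efficiency claimed above.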
