
Multilingual Domain Adaptation for NMT: Decoupling Language and Domain Information with Adapters

2021-05-16 · ACL ARR May 2021

Anonymous


Abstract

Adapter layers are lightweight, learnable units inserted between transformer layers. Recent work explores using such layers for neural machine translation (NMT), to adapt pre-trained models to new domains or language pairs. We propose strategies to compose language and domain adapters. Our goals are both parameter-efficient adaptation to multiple domains and languages simultaneously, and cross-lingual transfer in domains where parallel data is unavailable for certain language pairs. We find that a naive combination of domain-specific and language-specific adapters often results in translations into the wrong language. We study other ways to combine the adapters to alleviate this issue and maximize cross-lingual transfer. With our best adapter combinations, we obtain improvements of 3-4 BLEU on average for source languages that do not have in-domain data. For target languages without in-domain data, we achieve a similar improvement by combining adapters with back-translation.
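To make the idea concrete, here is a minimal sketch of a bottleneck adapter and one way to compose a language adapter with a domain adapter. The dimensions, initialization, and stacking order are illustrative assumptions, not the paper's exact method (the paper studies several combination strategies).

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

class Adapter:
    """Bottleneck adapter: down-project, nonlinearity, up-project,
    then add a residual connection. Weights here are randomly
    initialized for illustration only."""
    def __init__(self, d_model, d_bottleneck, seed=0):
        rng = np.random.default_rng(seed)
        self.W_down = rng.normal(0.0, 0.02, (d_model, d_bottleneck))
        self.W_up = rng.normal(0.0, 0.02, (d_bottleneck, d_model))

    def __call__(self, h):
        # h: (batch, d_model) hidden states from a transformer layer
        return h + relu(h @ self.W_down) @ self.W_up

def stack_adapters(h, language_adapter, domain_adapter):
    """Hypothetical composition: apply the language adapter first,
    then the domain adapter, to a layer's output h."""
    return domain_adapter(language_adapter(h))
```

In this stacked composition the output keeps the hidden-state shape, so adapters for any language/domain pair can be swapped in without touching the frozen transformer weights.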
