
Zero-shot Cross-lingual Transfer is Under-specified Optimization

2021-10-16 · ACL ARR October 2021

Anonymous


Abstract

Pretrained multilingual encoders enable zero-shot cross-lingual transfer, but often produce unreliable models that exhibit high performance variance on the target language. We postulate that this high variance results from zero-shot cross-lingual transfer solving an under-specified optimization problem. We show via model interpolation that the source-language monolingual model and the source + target bilingual model are linearly connected, suggesting that the model struggles to identify solutions that are good for both the source and target languages when trained on the source language alone.
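The interpolation probe described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's code: the function name and the toy parameter dictionaries are hypothetical, and real experiments would evaluate task loss or accuracy at each interpolated point.

```python
import numpy as np

def interpolate_params(theta_a, theta_b, alpha):
    """Linearly interpolate two models' parameters:
    theta_alpha = (1 - alpha) * theta_a + alpha * theta_b,
    applied elementwise to matching parameter tensors."""
    return {name: (1 - alpha) * theta_a[name] + alpha * theta_b[name]
            for name in theta_a}

# Toy stand-ins for the source-only monolingual model and the
# source + target bilingual model (hypothetical 2-parameter "models").
theta_src = {"w": np.array([1.0, 0.0]), "b": np.array([0.5])}
theta_bi  = {"w": np.array([0.0, 1.0]), "b": np.array([1.5])}

# Sweep alpha from 0 (source-only model) to 1 (bilingual model);
# evaluating the target-language loss along this path tests whether
# the two solutions are linearly connected.
for alpha in np.linspace(0.0, 1.0, 5):
    theta = interpolate_params(theta_src, theta_bi, alpha)
    print(f"alpha={alpha:.2f}  w={theta['w']}  b={theta['b']}")
```

If the loss stays low along the whole path, the two models lie in a linearly connected region of the loss landscape, which is the kind of evidence the abstract appeals to.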
