Adapting BigScience Multilingual Model to Unseen Languages

2022-04-11

Zheng-Xin Yong, Vassilina Nikoulina


Abstract

We benchmark different strategies for adding new languages (German and Korean) to BigScience's pretrained multilingual language model with 1.3 billion parameters, which currently supports 13 languages. We investigate the factors that affect the model's language adaptability and the trade-offs between computational cost and expected performance.
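One family of adaptation strategies the abstract alludes to is parameter-efficient: freeze the pretrained transformer body and train only the token embeddings extended to cover the new language's vocabulary. The sketch below is a hypothetical, minimal illustration of that idea (the `TinyLM` module, sizes, and names are placeholders, not the authors' code or the actual BigScience architecture):

```python
import torch
import torch.nn as nn

class TinyLM(nn.Module):
    """Stand-in for a pretrained LM: embedding -> body -> LM head."""
    def __init__(self, vocab_size, dim=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.body = nn.Linear(dim, dim)      # placeholder for transformer layers
        self.head = nn.Linear(dim, vocab_size)

    def forward(self, ids):
        return self.head(torch.tanh(self.body(self.embed(ids))))

# Hypothetical sizes: original vocabulary plus tokens for the new language.
old_vocab, new_tokens = 100, 20
model = TinyLM(old_vocab + new_tokens)

# Freeze everything, then re-enable gradients only for the embedding table,
# so adaptation touches a small fraction of the parameters.
for p in model.parameters():
    p.requires_grad = False
model.embed.weight.requires_grad = True

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
```

In a real setup one would extend the pretrained tokenizer and resize the embedding matrix rather than build a model from scratch; the point here is only the freeze-all, unfreeze-embeddings pattern, whose cost scales with vocabulary size instead of model size.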
