Can Monolingual Pretrained Models Help Cross-Lingual Classification?
2019-11-10 · Asian Chapter of the Association for Computational Linguistics
Zewen Chi, Li Dong, Furu Wei, Xian-Ling Mao, He-Yan Huang
Abstract
Multilingual pretrained language models (such as multilingual BERT) have achieved impressive results for cross-lingual transfer. However, because model capacity is fixed while being shared across many languages, multilingual pretraining usually lags behind its monolingual counterparts. In this work, we present two approaches to improve zero-shot cross-lingual classification by transferring knowledge from monolingual pretrained models to multilingual ones. Experimental results on two cross-lingual classification benchmarks show that our methods outperform vanilla multilingual fine-tuning.
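The abstract does not spell out the two approaches, so the sketch below is only a generic illustration of one plausible way to transfer knowledge from a monolingual teacher (e.g., English BERT) to a multilingual student during fine-tuning: a distillation-style loss that mixes cross-entropy on gold labels with a KL term toward the teacher's soft predictions. The function name, hyperparameters, and the distillation formulation itself are assumptions, not the paper's actual method.

```python
import torch
import torch.nn.functional as F


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      temperature: float = 2.0,
                      alpha: float = 0.5) -> torch.Tensor:
    """Hypothetical teacher-student objective (not from the paper).

    Mixes the usual cross-entropy on gold labels with a KL term that pushes
    the multilingual student's predictions toward the soft labels produced
    by a frozen monolingual teacher on the same (source-language) inputs.
    """
    # Soft targets from the monolingual teacher, smoothed by the temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    log_probs = F.log_softmax(student_logits / temperature, dim=-1)

    # KL divergence between student and teacher distributions,
    # rescaled by T^2 as is standard in distillation setups.
    kd = F.kl_div(log_probs, soft_targets, reduction="batchmean") * temperature ** 2

    # Ordinary supervised loss on the labeled source-language data.
    ce = F.cross_entropy(student_logits, labels)

    return alpha * kd + (1.0 - alpha) * ce
```

In a zero-shot setting, such a loss would only be applied to labeled source-language data (e.g., English); the fine-tuned multilingual student is then evaluated directly on the target languages.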