
Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding

2021-11-10

Qianying Liu, Fei Cheng, Sadao Kurohashi



Abstract

Meta-learning with auxiliary languages has demonstrated promising improvements for cross-lingual natural language processing. However, previous studies sample the meta-training and meta-testing data from the same language, which limits the model's ability to transfer across languages. In this paper, we propose XLA-MAML, which performs direct cross-lingual adaption in the meta-learning stage. We conduct zero-shot and few-shot experiments on Natural Language Inference and Question Answering. The experimental results demonstrate the effectiveness of our method across different languages, tasks, and pretrained models. We also analyze various cross-lingual-specific settings for meta-learning, including sampling strategy and parallelism.
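
The core idea in the abstract — drawing the meta-training (support) and meta-testing (query) data from two different languages within each MAML update — can be illustrated with the minimal sketch below. This is an illustrative PyTorch-style sketch under assumed details, not the authors' implementation: the toy linear model, the language list, and the `sample_batch` helper are hypothetical placeholders standing in for a pretrained encoder and real NLI/QA batches.

```python
import random

import torch
import torch.nn as nn
import torch.nn.functional as F

LANGUAGES = ["de", "es", "fr", "ru", "zh"]  # example auxiliary languages (assumed)

def sample_batch(language, dim=16, batch_size=8, num_classes=3):
    """Stand-in for drawing a labeled batch from `language` (random tensors here)."""
    x = torch.randn(batch_size, dim)
    y = torch.randint(0, num_classes, (batch_size,))
    return x, y

model = nn.Linear(16, 3)                      # placeholder for encoder + classifier head
loss_fn = nn.CrossEntropyLoss()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
inner_lr = 1e-2

for step in range(100):
    # Cross-lingual sampling: the support (meta-train) and query (meta-test)
    # batches come from two *different* languages.
    lang_support, lang_query = random.sample(LANGUAGES, 2)
    xs, ys = sample_batch(lang_support)
    xq, yq = sample_batch(lang_query)

    # Inner step: one gradient step on the support-language batch.
    support_loss = loss_fn(model(xs), ys)
    grads = torch.autograd.grad(support_loss, model.parameters(), create_graph=True)
    fast_weight, fast_bias = [w - inner_lr * g
                              for w, g in zip(model.parameters(), grads)]

    # Query loss on the other language, computed with the adapted ("fast") weights.
    query_loss = loss_fn(F.linear(xq, fast_weight, fast_bias), yq)

    # Outer (meta) update of the original parameters.
    meta_opt.zero_grad()
    query_loss.backward()
    meta_opt.step()
```

The only change from a standard single-language MAML loop is the sampling step: because the adapted weights are evaluated on a language other than the one they were adapted on, the meta-gradient rewards initializations that transfer across languages rather than within one.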
