
Meta-Learning for Natural Language Understanding under Continual Learning Framework

2020-11-03

Jiacheng Wang, Yong Fan, Duo Jiang, Shiqing Li


Abstract

Neural networks have achieved notable success on a wide range of natural language understanding (NLU) tasks. Various methods train a single robust model to handle multiple tasks and thereby learn a general representation of text. In this paper, we apply the Model-Agnostic Meta-Learning (MAML) and Online-aware Meta-learning (OML) meta-objectives within a continual learning framework for NLU tasks. We validate our methods on selected tasks from the SuperGLUE and GLUE benchmarks.
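To make the MAML meta-objective mentioned in the abstract concrete, here is a minimal first-order sketch on toy linear-regression tasks. Everything below (the toy task, function names, and hyperparameters) is an illustrative assumption for exposition, not the paper's actual setup or model:

```python
import numpy as np

rng = np.random.default_rng(0)

def loss_and_grad(w, X, y):
    """Mean-squared-error loss of a linear model and its gradient w.r.t. w."""
    err = X @ w - y
    return np.mean(err ** 2), 2 * X.T @ err / len(y)

def sample_task(rng, n=20, d=5):
    """A toy 'task': regression data drawn from a random ground-truth weight."""
    w_true = rng.normal(size=d)
    X = rng.normal(size=(n, d))
    return X, X @ w_true

# First-order MAML: adapt to each task with an inner gradient step, then
# update the meta-parameters with the post-adaptation gradients.
alpha, beta, d = 0.05, 0.01, 5   # inner LR, outer LR, input dimension
w = np.zeros(d)                  # meta-parameters
for step in range(200):
    meta_grad = np.zeros(d)
    for _ in range(4):                        # batch of sampled tasks
        X, y = sample_task(rng)
        _, g = loss_and_grad(w, X, y)
        w_adapted = w - alpha * g             # inner-loop adaptation
        _, g_post = loss_and_grad(w_adapted, X, y)
        meta_grad += g_post                   # first-order approximation
    w -= beta * meta_grad / 4                 # outer (meta) update
```

The meta-parameters `w` are trained so that a single inner gradient step on a new task already lowers that task's loss; full MAML would additionally backpropagate through the inner step, which the first-order variant drops for simplicity.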
