
MAML-CL: Edited Model-Agnostic Meta-Learning for Continual Learning

2021-11-16 · ACL ARR November 2021 · Code Available

Anonymous


Abstract

Continual learning (CL) aims to learn well on all sequentially seen tasks drawn from various domains. Yet existing sequential training methods fail to consolidate knowledge learned from earlier tasks under data distribution shifts, thereby leading to catastrophic forgetting. We devise an optimization-based meta-learning framework for CL built on MAML, in which query samples are edited to promote generalization of learned knowledge. We conduct extensive experiments on text classification in a low-resource CL setup, where the training set is downsized to 10% of its original size. The results demonstrate the superiority of our method in terms of stability, fast adaptation, memory efficiency, and knowledge retention across domains.
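The abstract describes an optimization-based meta-learning loop in the style of MAML: adapt on each task's support set in an inner loop, then update the meta-parameters from the loss on that task's query set. As a rough illustration (not the paper's method, which additionally edits the query samples), here is a minimal first-order MAML (FOMAML) sketch on toy linear-regression tasks; all names (`loss_and_grad`, `fomaml_step`, the toy task generator) are hypothetical.

```python
import numpy as np

def loss_and_grad(w, X, y):
    # Squared-error loss for a linear model pred = X @ w, plus its gradient in w.
    pred = X @ w
    err = pred - y
    return 0.5 * np.mean(err ** 2), X.T @ err / len(y)

def fomaml_step(w, tasks, inner_lr=0.1, outer_lr=0.05):
    # One meta-update. For each task: adapt on the support set (inner loop),
    # then accumulate the query-set gradient at the adapted parameters.
    # Taking that gradient at w_adapt instead of differentiating through the
    # inner step is the first-order MAML approximation.
    meta_grad = np.zeros_like(w)
    for (Xs, ys), (Xq, yq) in tasks:
        _, g = loss_and_grad(w, Xs, ys)
        w_adapt = w - inner_lr * g              # inner-loop adaptation
        _, gq = loss_and_grad(w_adapt, Xq, yq)  # evaluate on query set
        meta_grad += gq
    return w - outer_lr * meta_grad / len(tasks)

# Toy usage: tasks differ only in slope; meta-training moves w toward an
# initialization that adapts well to all of them.
rng = np.random.default_rng(0)

def make_task(slope):
    Xs = rng.normal(size=(10, 1)); ys = slope * Xs[:, 0]
    Xq = rng.normal(size=(10, 1)); yq = slope * Xq[:, 0]
    return (Xs, ys), (Xq, yq)

tasks = [make_task(s) for s in (0.8, 1.0, 1.2)]
w = np.zeros(1)
for _ in range(200):
    w = fomaml_step(w, tasks)
```

The paper's contribution, per the abstract, is what happens to `Xq, yq` before the outer update: query samples are edited so the meta-gradient favors retention across the sequentially seen tasks, rather than being used raw as in this sketch.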
