Meta-Consolidation for Continual Learning

2020-10-01 · NeurIPS 2020 · Code Available

K J Joseph, Vineeth N. Balasubramanian


Abstract

The ability to continuously learn and adapt to new tasks, without losing grasp of already acquired knowledge, is a hallmark of biological learning systems that current deep learning systems fall short of. In this work, we present a novel methodology for continual learning called MERLIN: Meta-Consolidation for Continual Learning. We assume that the weights ψ of a neural network for solving task t come from a meta-distribution p(ψ|t). This meta-distribution is learned and consolidated incrementally. We operate in the challenging online continual learning setting, where a data point is seen by the model only once. Our experiments on continual learning benchmarks built from the MNIST, CIFAR-10, CIFAR-100, and Mini-ImageNet datasets show consistent improvement over five baselines, including a recent state of the art, corroborating the promise of MERLIN.
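To make the core idea concrete, here is a minimal sketch (not the authors' code) of sampling task-specific weights ψ from a learned meta-distribution p(ψ|t) and loading them into a network. The paper learns this distribution with a richer generative model consolidated across tasks; the toy per-task diagonal Gaussian below, and all names in it (TaskMetaDistribution, load_flat_weights), are hypothetical stand-ins assuming PyTorch.

```python
# Sketch: weights psi of a network are drawn from p(psi | t),
# a per-task distribution over a flattened weight vector.
import torch
import torch.nn as nn

class TaskMetaDistribution(nn.Module):
    """Toy diagonal Gaussian over a flat weight vector, one per task."""
    def __init__(self, num_tasks: int, weight_dim: int):
        super().__init__()
        self.mu = nn.Parameter(torch.zeros(num_tasks, weight_dim))
        self.log_var = nn.Parameter(torch.zeros(num_tasks, weight_dim))

    def sample(self, task: int) -> torch.Tensor:
        # Reparameterised draw: psi = mu_t + sigma_t * eps
        std = torch.exp(0.5 * self.log_var[task])
        return self.mu[task] + std * torch.randn_like(std)

def load_flat_weights(model: nn.Module, flat: torch.Tensor) -> None:
    """Copy a flat weight vector into a model's parameters, in order."""
    offset = 0
    for p in model.parameters():
        n = p.numel()
        p.data.copy_(flat[offset:offset + n].view_as(p))
        offset += n

# Usage: draw weights for task 0 and run inference with them.
classifier = nn.Linear(784, 10)  # toy base network
weight_dim = sum(p.numel() for p in classifier.parameters())
meta = TaskMetaDistribution(num_tasks=5, weight_dim=weight_dim)

psi = meta.sample(task=0)                 # psi ~ p(psi | t=0)
load_flat_weights(classifier, psi)
logits = classifier(torch.randn(1, 784))  # forward pass with sampled weights
```

Because inference only needs a sample from p(ψ|t), consolidating what the model knows amounts to consolidating this meta-distribution as new tasks arrive, rather than protecting a single fixed weight vector.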
