
Enabling Continual Learning in Neural Networks with Meta Learning

2019-05-16 · ICML Workshop AMTL 2019

Anonymous


Abstract

Catastrophic forgetting in neural networks is one of the most well-known problems in continual learning. Previous attempts at addressing the problem focus on preventing important weights from changing. Such methods often require task boundaries to learn effectively and do not support backward transfer learning. In this paper, we propose a meta-learning algorithm which learns to reconstruct the gradients of old tasks w.r.t. the current parameters and combines these reconstructed gradients with the current gradient to enable continual learning and backward transfer learning from the current task to previous tasks. Experiments on standard continual learning benchmarks show that our algorithm can effectively prevent catastrophic forgetting and support backward transfer learning.
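The update rule the abstract describes can be sketched at a high level: a meta-learned predictor reconstructs the gradient of old tasks with respect to the current parameters, and that reconstruction is combined with the current task's gradient before taking a step. The sketch below is a minimal toy illustration of that combination, not the paper's method; the predictor (`predict_old_grad`), the toy losses, and the additive combination are all illustrative assumptions.

```python
import numpy as np

def current_task_grad(theta):
    # Gradient of a toy current-task loss L(theta) = 0.5 * ||theta - 1||^2.
    return theta - np.ones_like(theta)

def predict_old_grad(theta, W):
    # Stand-in for the meta-learned gradient reconstructor: a fixed linear
    # map from current parameters to a reconstructed old-task gradient.
    # (In the paper this mapping is itself learned; here it is assumed.)
    return W @ theta

def combined_step(theta, W, lr=0.1):
    # Combine the reconstructed old-task gradient with the current gradient,
    # so the update also respects the (reconstructed) old objective.
    g_new = current_task_grad(theta)
    g_old = predict_old_grad(theta, W)
    return theta - lr * (g_new + g_old)

rng = np.random.default_rng(0)
theta = rng.normal(size=4)
W = 0.1 * np.eye(4)  # toy "old task" pulling parameters toward 0
for _ in range(100):
    theta = combined_step(theta, W)
# theta settles between the two objectives rather than fitting only the
# current task, which is the qualitative effect the combination aims for.
```

With the toy quadratics above, the iteration converges to the minimizer of the summed objectives (here about 0.91 per coordinate) instead of the current-task optimum of 1.0, illustrating how the combined gradient trades off new and old tasks.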
