
Continual Learning with Neuron Activation Importance

2021-07-27

Sohee Kim, Seungkyu Lee


Abstract

Continual learning is a form of online learning over multiple sequential tasks. One of the critical barriers to continual learning is that a network must learn a new task while retaining the knowledge of old tasks, without access to any data from those old tasks. In this paper, we propose a neuron-activation-importance-based regularization method for stable continual learning regardless of the order of tasks. We conduct comprehensive experiments on existing benchmark data sets to evaluate not only the stability and plasticity of our method, with improved classification accuracy, but also the robustness of its performance to changes in task order.
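The abstract does not give the exact formulation, but importance-based regularization methods of this kind typically add a quadratic penalty that discourages changing parameters deemed important to old tasks. The sketch below is an assumption, not the paper's method: it takes each neuron's mean absolute activation on old-task data as a proxy for importance and applies an importance-weighted penalty on parameter drift.

```python
import numpy as np

def activation_importance(activations):
    """Assumed importance proxy: mean absolute activation per neuron
    over a batch of old-task inputs (shape: batch x neurons)."""
    return np.mean(np.abs(activations), axis=0)

def regularized_loss(task_loss, params, old_params, importance, lam=0.1):
    """New-task loss plus an importance-weighted quadratic penalty that
    anchors parameters important to old tasks near their old values."""
    penalty = np.sum(importance * (params - old_params) ** 2)
    return task_loss + lam * penalty

# Toy usage: two neurons, one with larger activations (hence more protected).
acts = np.array([[1.0, -2.0], [3.0, 0.0]])
imp = activation_importance(acts)          # [2.0, 1.0]
loss = regularized_loss(1.0,
                        params=np.array([1.0, 1.0]),
                        old_params=np.array([0.0, 0.0]),
                        importance=imp)    # 1.0 + 0.1 * (2*1 + 1*1) = 1.3
```

Because the penalty is weighted per neuron, drift in parameters tied to highly active (important) neurons is penalized more, which is the intuition behind keeping old-task knowledge while learning a new task.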
