Essentials for Class Incremental Learning

2021-02-18 · Code Available

Sudhanshu Mittal, Silvio Galesso, Thomas Brox

Abstract

Contemporary neural networks are limited in their ability to learn from evolving streams of training data. When trained sequentially on new or evolving tasks, their accuracy drops sharply, making them unsuitable for many real-world applications. In this work, we shed light on the causes of this well-known yet unsolved phenomenon, often referred to as catastrophic forgetting, in a class-incremental setup. We show that a combination of simple components and a loss that balances intra-task and inter-task learning can already resolve forgetting to the same extent as more complex measures proposed in the literature. Moreover, we identify the poor quality of the learned representation as another cause of catastrophic forgetting in class-IL. We show that performance correlates with the secondary class information (dark knowledge) learned by the model, and that it can be improved by an appropriate regularizer. With these lessons learned, class-incremental learning results on CIFAR-100 and ImageNet improve over the state of the art by a large margin, while keeping the approach simple.
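
The page does not spell out the balanced loss the abstract refers to. Below is a minimal sketch, assuming the common instantiation of such a loss: cross-entropy on the current task (intra-task) combined with knowledge distillation from the frozen previous-task model (inter-task). The function name `incremental_loss` and the hyperparameters `lambda_kd` and `temperature` are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, assuming a cross-entropy + knowledge-distillation
# combination; names and hyperparameter values here are illustrative
# guesses, not taken from the paper.
import torch
import torch.nn.functional as F

def incremental_loss(logits_new, logits_old, targets,
                     num_old_classes, lambda_kd=1.0, temperature=2.0):
    """Balance intra-task learning (CE) against inter-task stability (KD)."""
    # Intra-task term: standard cross-entropy over all classes seen so far.
    ce = F.cross_entropy(logits_new, targets)

    # Inter-task term: match the softened old-class predictions of the
    # current model to those of the frozen previous model, preserving the
    # secondary class information ("dark knowledge") the abstract mentions.
    log_p_new = F.log_softmax(logits_new[:, :num_old_classes] / temperature, dim=1)
    p_old = F.softmax(logits_old[:, :num_old_classes] / temperature, dim=1)
    kd = F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2

    return ce + lambda_kd * kd

# Toy usage: batch of 4 samples, 10 old classes + 5 new classes.
logits_new = torch.randn(4, 15)   # current model
logits_old = torch.randn(4, 15)   # frozen previous-task model
targets = torch.randint(0, 15, (4,))
loss = incremental_loss(logits_new, logits_old, targets, num_old_classes=10)
```

The `temperature ** 2` factor follows the standard distillation scaling so that the gradient magnitude of the soft term stays comparable across temperatures.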

Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
CIFAR-100 (50 classes + 10 steps of 5 classes) | CCIL-SD | Average Incremental Accuracy | 65.86 | | Unverified
CIFAR-100 (50 classes + 5 steps of 10 classes) | CCIL-SD | Average Incremental Accuracy | 67.17 | | Unverified
ImageNet-100 (50 classes + 10 steps of 5 classes) | CCIL-SD | Average Incremental Accuracy | 76.77 | | Unverified
ImageNet-100 (50 classes + 5 steps of 10 classes) | CCIL-SD | Average Incremental Accuracy | 79.44 | | Unverified
ImageNet (500 classes + 5 steps of 100 classes) | CCIL-SD | Average Incremental Accuracy | 68.04 | | Unverified
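
The page does not define the metric in the table. As a hedged sketch, "Average Incremental Accuracy" is conventionally the mean of the test accuracies on all classes seen so far, measured after each step of the run (including the initial step); the code below assumes that convention.

```python
# Assumed definition of average incremental accuracy: the mean of the
# per-step accuracies over the whole incremental run. This reflects the
# common convention, not a definition given on this page.
def average_incremental_accuracy(step_accuracies):
    """Mean of the accuracies measured after each incremental step."""
    return sum(step_accuracies) / len(step_accuracies)

# Illustrative numbers only (base step + 5 incremental steps):
accs = [80.1, 74.3, 70.2, 66.5, 63.0, 60.9]
print(f"{average_incremental_accuracy(accs):.2f}")  # 69.17
```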

Reproductions