SOTAVerified

Continual Learning in Open-vocabulary Classification with Complementary Memory Systems

2023-07-04 · Code Available

Zhen Zhu, Weijie Lyu, Yao Xiao, Derek Hoiem


Abstract

We introduce a method for flexible and efficient continual learning in open-vocabulary image classification, drawing inspiration from the complementary learning systems observed in human cognition. Specifically, we propose to combine predictions from a CLIP zero-shot model and an exemplar-based model, using the zero-shot estimated probability that a sample's class is within the exemplar classes. We also propose a "tree probe" method, an adaptation of lazy learning principles, which enables fast learning from new examples with accuracy competitive with batch-trained linear models. We test in data-incremental, class-incremental, and task-incremental settings, as well as the ability to perform flexible inference on varying subsets of zero-shot and learned categories. Our proposed method achieves a good balance of learning speed, target-task effectiveness, and zero-shot effectiveness. Code will be available at https://github.com/jessemelpolio/TreeProbe.
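The prediction-mixing idea in the abstract can be sketched as follows. This is a hypothetical illustration based only on the abstract's description, not the paper's actual implementation: the function name `combine_predictions` and the exact mixing rule are assumptions. The weight on the exemplar model is the zero-shot probability mass assigned to the exemplar classes.

```python
import numpy as np

def combine_predictions(p_zeroshot, p_exemplar, exemplar_mask):
    """Mix zero-shot and exemplar-model class probabilities.

    Hypothetical sketch, not the paper's code. Assumes:
      p_zeroshot:    zero-shot probabilities over all C classes, shape (C,)
      p_exemplar:    exemplar-model probabilities over all C classes,
                     zero outside the exemplar classes, shape (C,)
      exemplar_mask: boolean array, True for classes with stored exemplars
    """
    # Zero-shot estimate that the true class lies among the exemplar classes
    w = p_zeroshot[exemplar_mask].sum()
    # Trust the exemplar model in proportion to that estimate
    return w * p_exemplar + (1.0 - w) * p_zeroshot

# Example: 3 classes, the first two have stored exemplars
p_zs = np.array([0.5, 0.3, 0.2])
p_ex = np.array([0.6, 0.4, 0.0])
mask = np.array([True, True, False])
p = combine_predictions(p_zs, p_ex, mask)  # remains a valid distribution
```

If the zero-shot model places most of its mass on exemplar classes (here w = 0.8), the combined prediction leans on the exemplar model; otherwise it falls back toward the zero-shot prediction, preserving open-vocabulary behavior.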
