Meta-learning with differentiable closed-form solvers
Luca Bertinetto, João F. Henriques, Philip H. S. Torr, Andrea Vedaldi
Code
- github.com/learnables/learn2learn (PyTorch) ★ 2,878
- github.com/icoz69/DeepEMD (PyTorch) ★ 604
- github.com/fiveai/on-episodes-fsl (PyTorch) ★ 35
- github.com/goldblum/AdversarialQuerying (PyTorch) ★ 0
- github.com/bertinetto/r2d2 (PyTorch) ★ 0
Abstract
Adapting deep networks to new concepts from a few examples is challenging, due to the high computational requirements of standard fine-tuning procedures. Most work on few-shot learning has thus focused on simple learning techniques for adaptation, such as nearest neighbours or gradient descent. Nonetheless, the machine learning literature contains a wealth of methods that learn non-deep models very efficiently. In this paper, we propose to use these fast convergent methods as the main adaptation mechanism for few-shot learning. The main idea is to teach a deep network to use standard machine learning tools, such as ridge regression, as part of its own internal model, enabling it to quickly adapt to novel data. This requires back-propagating errors through the solver steps. While normally the cost of the matrix operations involved in such a process would be significant, by using the Woodbury identity we can make the small number of examples work to our advantage. We propose both closed-form and iterative solvers, based on ridge regression and logistic regression components. Our methods constitute a simple and novel approach to the problem of few-shot learning and achieve performance competitive with or superior to the state of the art on three benchmarks.
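To make the core idea concrete, below is a minimal PyTorch sketch of a differentiable ridge-regression head of the kind the abstract describes. All names and values here (`ridge_head`, the episode shapes, the regularization constant `lam`) are illustrative assumptions rather than the authors' implementation; see github.com/bertinetto/r2d2 for actual code. With n support examples and d-dimensional embeddings (n much smaller than d in few-shot settings), the Woodbury identity lets one solve an n×n linear system instead of a d×d one, and autograd back-propagates errors through the solve.

```python
import torch
import torch.nn.functional as F

def ridge_head(X, Y, lam=1.0):
    """Differentiable ridge regression via the Woodbury identity (sketch).

    X: (n, d) support embeddings; Y: (n, c) one-hot support labels.
    The standard solution W = (X^T X + lam*I_d)^{-1} X^T Y inverts a
    d x d matrix; the equivalent form W = X^T (X X^T + lam*I_n)^{-1} Y
    inverts only an n x n matrix, which is cheap when n (shots) << d.
    """
    n = X.shape[0]
    G = X @ X.t() + lam * torch.eye(n, device=X.device, dtype=X.dtype)
    alpha = torch.linalg.solve(G, Y)  # (n, c); autograd flows through the solve
    return X.t() @ alpha              # W: (d, c)

# Hypothetical episode: 5-way, 5-shot, 64-dim embeddings, 15 queries per class.
d, n_way, n_shot, n_query = 64, 5, 5, 15
X_support = torch.randn(n_way * n_shot, d, requires_grad=True)
Y_support = torch.eye(n_way).repeat_interleave(n_shot, dim=0)
X_query = torch.randn(n_way * n_query, d)
y_query = torch.arange(n_way).repeat_interleave(n_query)

W = ridge_head(X_support, Y_support, lam=1.0)
logits = X_query @ W                  # regressed scores used as class logits
loss = F.cross_entropy(logits, y_query)
loss.backward()                       # errors back-propagate through the solver
```

In a full meta-learning loop, `X_support` and `X_query` would come from a shared embedding network, so the gradient reaching `X_support` here is what trains that network to produce features for which the closed-form solver adapts well.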