
Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels

2019-10-11 · NeurIPS 2020 · Code Available

Massimiliano Patacchiola, Jack Turner, Elliot J. Crowley, Michael O'Boyle, Amos Storkey


Abstract

Recently, different machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task. Common approaches have taken the form of meta-learning: learning to learn on the new problem given the old. Following the recognition that meta-learning is implementing learning in a multi-level model, we present a Bayesian treatment for the meta-learning inner loop through the use of deep kernels. As a result we can learn a kernel that transfers to new tasks; we call this Deep Kernel Transfer (DKT). This approach has many advantages: it is straightforward to implement as a single optimizer, provides uncertainty quantification, and does not require estimation of task-specific parameters. We empirically demonstrate that DKT outperforms several state-of-the-art algorithms in few-shot classification, and is the state of the art for cross-domain adaptation and regression. We conclude that complex meta-learning routines can be replaced by a simpler Bayesian model without loss of accuracy.
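The core idea in the abstract, replacing the meta-learning inner loop with exact Gaussian-process inference under a deep kernel, trained by a single outer optimizer, can be sketched as below. This is a minimal NumPy illustration, not the authors' implementation: the tiny linear-plus-tanh "feature extractor", the RBF base kernel, and all parameter names are assumptions for the sake of the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x, W):
    """Toy stand-in for a deep feature extractor (illustrative, not the paper's network)."""
    return np.tanh(x @ W)

def rbf_kernel(z1, z2, lengthscale=1.0):
    """RBF base kernel applied to extracted features: the 'deep kernel'."""
    sqdist = ((z1[:, None, :] - z2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * sqdist / lengthscale**2)

def gp_nll(x, y, W, noise=0.1):
    """Negative log marginal likelihood of an exact GP on one task.

    This plays the role of the per-task loss: no task-specific
    parameters are estimated, since the GP posterior is computed
    in closed form.
    """
    z = features(x, W)
    K = rbf_kernel(z, z) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.log(np.diag(L)).sum() + 0.5 * len(x) * np.log(2 * np.pi)

# Meta-training objective: the sum of per-task GP losses, minimized
# with respect to the shared parameters W by a single optimizer.
W = rng.normal(size=(2, 4))
tasks = [(rng.normal(size=(5, 2)), rng.normal(size=5)) for _ in range(3)]
loss = sum(gp_nll(x, y, W) for x, y in tasks)
```

In a real implementation the gradient of `loss` with respect to the shared kernel and network parameters would be taken with an autodiff framework; the point here is only that the objective is a single marginal likelihood with no inner adaptation loop.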

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| CUB 200 5-way (1-shot) | DKT + BNCosSim | Accuracy | 72.27 | | Unverified |
| CUB 200 5-way (5-shot) | DKT + BNCosSim | Accuracy | 85.64 | | Unverified |
| Mini-ImageNet 5-way (1-shot) | DKT + BNCosSim | Accuracy | 62.96 | | Unverified |
| Mini-ImageNet 5-way (5-shot) | DKT + BNCosSim | Accuracy | 64 | | Unverified |
| Mini-ImageNet-CUB 5-way (1-shot) | DKT + CosSim | Accuracy | 40.22 | | Unverified |
| Mini-ImageNet-CUB 5-way (5-shot) | DKT + BNCosSim | Accuracy | 56.4 | | Unverified |
| OMNIGLOT-EMNIST 5-way (1-shot) | DKT + BNCosSim | Accuracy | 75.4 | | Unverified |
| OMNIGLOT-EMNIST 5-way (5-shot) | DKT + BNCosSim | Accuracy | 90.3 | | Unverified |

Reproductions