SOTAVerified

SimpleShot: Revisiting Nearest-Neighbor Classification for Few-Shot Learning

2019-11-12 · Code Available

Yan Wang, Wei-Lun Chao, Kilian Q. Weinberger, Laurens van der Maaten


Abstract

Few-shot learners aim to recognize new object classes based on a small number of labeled training examples. To prevent overfitting, state-of-the-art few-shot learners use meta-learning on convolutional-network features and perform classification using a nearest-neighbor classifier. This paper studies the accuracy of nearest-neighbor baselines without meta-learning. Surprisingly, we find simple feature transformations suffice to obtain competitive few-shot learning accuracies. For example, we find that a nearest-neighbor classifier used in combination with mean-subtraction and L2-normalization outperforms prior results in three out of five settings on the miniImageNet dataset.
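The transformation the abstract refers to is simple enough to sketch directly: subtract the mean feature vector of the base classes, L2-normalize, then classify each query by its nearest class centroid. The following NumPy sketch illustrates this pipeline under the assumption that pre-extracted network features are given; function names are illustrative, not from the authors' code.

```python
import numpy as np

def cl2n_transform(features, base_mean):
    """Centering (mean subtraction) followed by L2 normalization ("CL2N")."""
    centered = features - base_mean
    norms = np.linalg.norm(centered, axis=-1, keepdims=True)
    return centered / np.clip(norms, 1e-12, None)  # guard against zero vectors

def nearest_centroid_predict(support, support_labels, query, base_mean):
    """Classify query features by the nearest class centroid in CL2N space.

    support:        (n_support, d) feature matrix of the labeled few-shot examples
    support_labels: (n_support,) integer class labels
    query:          (n_query, d) feature matrix to classify
    base_mean:      (d,) mean feature vector computed on the base classes
    """
    support = cl2n_transform(support, base_mean)
    query = cl2n_transform(query, base_mean)
    classes = np.unique(support_labels)
    centroids = np.stack(
        [support[support_labels == c].mean(axis=0) for c in classes]
    )
    # Euclidean distance from every query to every class centroid
    dists = np.linalg.norm(query[:, None, :] - centroids[None, :, :], axis=-1)
    return classes[np.argmin(dists, axis=1)]
```

In the 1-shot case each centroid is a single support example, so this reduces to plain nearest-neighbor classification; in the 5-shot case it is a nearest-centroid (prototype) classifier.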

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| Dirichlet CUB-200 (5-way, 1-shot) | SimpleShot | 1:1 Accuracy | 70.6 | — | Unverified |
| Dirichlet CUB-200 (5-way, 5-shot) | SimpleShot | 1:1 Accuracy | 87.5 | — | Unverified |
| Dirichlet Mini-ImageNet (5-way, 1-shot) | SimpleShot | 1:1 Accuracy | 63.0 | — | Unverified |
| Dirichlet Mini-ImageNet (5-way, 5-shot) | SimpleShot | 1:1 Accuracy | 80.1 | — | Unverified |
| Dirichlet Tiered-ImageNet (5-way, 1-shot) | SimpleShot | 1:1 Accuracy | 69.6 | — | Unverified |
| Dirichlet Tiered-ImageNet (5-way, 5-shot) | SimpleShot | 1:1 Accuracy | 84.7 | — | Unverified |
| Mini-ImageNet 5-way (1-shot) | SimpleShot (CL2N-DenseNet) | Accuracy | 64.29 | — | Unverified |
| Mini-ImageNet 5-way (5-shot) | SimpleShot (CL2N-DenseNet) | Accuracy | 81.5 | — | Unverified |