SOTAVerified

Few-shot learning using pre-training and shots, enriched by pre-trained samples

2020-09-19

Detlef Schmicker


Abstract

We use the EMNIST dataset of handwritten digits to test a simple approach to few-shot learning. A fully connected neural network is pre-trained on a subset of the 10 digits and then used for few-shot learning on the untrained digits. Two basic ideas are introduced: during few-shot learning, learning in the first layer is disabled, and for every shot a previously unknown digit is used together with four previously trained digits for gradient descent, until a predefined threshold condition is fulfilled. This way we reach about 90% accuracy after 10 shots.
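The procedure in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: random weights stand in for the pre-trained network, random vectors stand in for EMNIST images, and the layer sizes, learning rate, and threshold value are assumptions. It shows the two ideas from the abstract: the first-layer weights are never updated, and each shot optimizes a batch of one novel-digit sample plus four already-trained samples until the loss drops below a threshold.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions standing in for EMNIST: 28x28 inputs, 10 digit classes.
D_IN, D_HID, N_CLASSES = 784, 64, 10

# Stand-in for the pre-trained weights; in the paper these would come
# from pre-training the fully connected network on a subset of the digits.
W1 = rng.normal(0.0, 0.05, (D_IN, D_HID))    # first layer: frozen below
W2 = rng.normal(0.0, 0.05, (D_HID, N_CLASSES))

def forward(X):
    """Hidden ReLU layer followed by a softmax readout."""
    h = np.maximum(X @ W1, 0.0)
    logits = h @ W2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)

def few_shot_step(X, y, lr=0.5):
    """One gradient-descent step on the cross-entropy loss.

    Only W2 is updated; the first layer stays frozen, as in the paper.
    """
    global W2
    h, p = forward(X)
    n = len(y)
    loss = -np.log(p[np.arange(n), y] + 1e-12).mean()
    grad_logits = (p - np.eye(N_CLASSES)[y]) / n
    W2 -= lr * h.T @ grad_logits
    return loss

# One "shot": a single sample of a previously unseen digit plus four
# samples of already-trained digits (random vectors here, for illustration).
novel_x, novel_y = rng.random((1, D_IN)), np.array([9])
known_x, known_y = rng.random((4, D_IN)), np.array([0, 1, 2, 3])
X = np.vstack([novel_x, known_x])
y = np.concatenate([novel_y, known_y])

THRESHOLD = 0.1   # illustrative value; the paper's threshold may differ
W1_before = W1.copy()
loss = few_shot_step(X, y)
for _ in range(2000):
    if loss < THRESHOLD:
        break
    loss = few_shot_step(X, y)
```

Mixing the novel sample with known-digit samples in every batch is what keeps the readout from collapsing onto the new class: the gradient on the shared W2 must keep the old digits classified correctly while carving out the new one.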
