Spatial Contrastive Learning for Few-Shot Classification
Yassine Ouali, Céline Hudelot, Myriam Tami
Code: github.com/yassouali/SCL (official, PyTorch)
Abstract
In this paper, we explore contrastive learning for few-shot classification, proposing to use it as an additional auxiliary training objective that acts as a data-dependent regularizer to promote more general and transferable features. In particular, we present a novel attention-based spatial contrastive objective to learn locally discriminative and class-agnostic features. As a result, our approach overcomes some of the limitations of the cross-entropy loss, such as its excessive discrimination toward seen classes, which reduces the transferability of features to unseen classes. With extensive experiments, we show that the proposed method outperforms state-of-the-art approaches, confirming the importance of learning good and transferable embeddings for few-shot learning.