Improving Few-Shot Learning with Auxiliary Self-Supervised Pretext Tasks
Nathaniel Simard, Guillaume Lagrange
Official code (PyTorch): github.com/nathanielsimard/improving-fs-ssl
Abstract
Recent work on few-shot learning (Tian et al., 2020) showed that the quality of learned representations plays an important role in few-shot classification performance. On the other hand, the goal of self-supervised learning is to recover useful semantic information from the data without the use of class labels. In this work, we exploit the complementarity of both paradigms via a multi-task framework in which recent self-supervised methods serve as auxiliary tasks. We found that combining multiple tasks is often beneficial, and that solving them simultaneously can be done efficiently. Our results suggest that self-supervised auxiliary tasks are effective data-dependent regularizers for representation learning. Our code is available at: https://github.com/nathanielsimard/improving-fs-ssl.
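The multi-task setup described above can be sketched in two pieces: generating pretext labels for one common self-supervised task (rotation prediction) and combining the main few-shot loss with weighted auxiliary losses. This is a minimal illustrative sketch, not the paper's implementation; the function names, the choice of rotation as the pretext task, and the uniform weighting scheme are assumptions for illustration.

```python
import numpy as np


def rotation_pretext_batch(images, rng):
    """Illustrative pretext-task generator (hypothetical helper): rotate each
    square image by a random multiple of 90 degrees and return the rotated
    images together with the rotation index (0-3) as the self-supervised label."""
    labels = rng.integers(0, 4, size=len(images))
    rotated = [np.rot90(img, k) for img, k in zip(images, labels)]
    return np.stack(rotated), labels


def multitask_loss(main_loss, aux_losses, weights=None):
    """Combine the few-shot classification loss with auxiliary self-supervised
    losses as a weighted sum (hypothetical helper; uniform weights by default)."""
    if weights is None:
        weights = [1.0] * len(aux_losses)
    return main_loss + sum(w * l for w, l in zip(weights, aux_losses))


# Example: a batch of four 8x8 images and two auxiliary losses.
imgs = np.zeros((4, 8, 8))
rotated, pretext_labels = rotation_pretext_batch(imgs, np.random.default_rng(0))
total = multitask_loss(1.0, [0.5, 0.25], weights=[1.0, 2.0])  # -> 2.0
```

In a training loop, `main_loss` would come from the few-shot classifier and each auxiliary loss from a self-supervised head sharing the same backbone, so the pretext tasks regularize the learned representation without requiring extra labels.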