
Self-Supervised Approach to Addressing Zero-Shot Learning Problem

2022-01-05 · Code Available

Ademola Okerinde, Sam Hoggatt, Divya Vani Lakkireddy, Nolan Brubaker, William Hsu, Lior Shamir, Brian Spiesman


Abstract

In recent years, self-supervised learning has had significant success in computer vision and natural language processing applications. The choice of pretext task is important to this boost in performance. One common pretext task is measuring the similarity and dissimilarity between pairs of images. In this scenario, the two images that make up a negative pair are visibly different to humans. In entomology, however, species are nearly indistinguishable and thus hard to differentiate. In this study, we explored the performance of a Siamese neural network trained with contrastive loss, learning to push apart the embeddings of dissimilar bumblebee species pairs and pull together the embeddings of similar pairs. Our experimental results show a 61% F1-score on zero-shot instances, an 11% improvement over the performance on samples of classes that overlap with the training set.
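The contrastive objective described in the abstract can be sketched as follows. This is a minimal illustration of the standard pairwise contrastive loss (Hadsell et al. style), not the authors' actual implementation: the function name, margin value, and NumPy formulation are assumptions for illustration. A label of 1 marks a similar pair (embeddings are pulled together), and 0 marks a dissimilar pair (embeddings are pushed apart until they are at least `margin` apart).

```python
import numpy as np

def contrastive_loss(emb_a, emb_b, label, margin=1.0):
    """Pairwise contrastive loss over a single pair of embeddings.

    label = 1: similar pair  -> penalize squared distance (pull together)
    label = 0: dissimilar pair -> penalize only if closer than margin (push apart)
    Note: hypothetical sketch; the paper's exact loss/margin may differ.
    """
    d = np.linalg.norm(emb_a - emb_b)          # Euclidean distance between embeddings
    pull = label * d ** 2                      # similar pairs: shrink distance
    push = (1 - label) * max(margin - d, 0.0) ** 2  # dissimilar pairs: enforce margin
    return pull + push
```

For example, a dissimilar pair already separated by more than the margin contributes zero loss, while a similar pair is penalized by its squared distance regardless of the margin.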
