
Contrastive Learning

Contrastive learning is a deep learning technique for unsupervised representation learning. The goal is to learn an embedding of the data such that similar instances lie close together in the representation space, while dissimilar instances lie far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.
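The "similar pairs close, dissimilar pairs far" objective above is commonly instantiated as a normalized temperature-scaled cross-entropy (NT-Xent / InfoNCE-style) loss over two augmented views of each sample, as in SimCLR. The sketch below is illustrative only; the function name, shapes, and temperature value are assumptions, not a reference implementation of any specific paper.

```python
import numpy as np

def nt_xent_loss(z_a, z_b, temperature=0.5):
    """NT-Xent-style contrastive loss (illustrative sketch).

    z_a, z_b: (N, D) embeddings of two augmented views of the same N samples.
    Row i of z_a and row i of z_b form a positive pair; every other row in the
    combined 2N-row batch serves as a negative.
    """
    z = np.concatenate([z_a, z_b], axis=0)            # (2N, D) combined batch
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # L2-normalise rows
    sim = z @ z.T / temperature                       # cosine similarities / T
    np.fill_diagonal(sim, -np.inf)                    # mask self-similarity
    n = len(z_a)
    # index of each row's positive partner: i <-> i + n
    pos_idx = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy of the positive against all remaining entries in the row
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos_idx].mean()

rng = np.random.default_rng(0)
z_a = rng.normal(size=(4, 8))
z_b = z_a + 0.01 * rng.normal(size=(4, 8))  # a slightly perturbed second view
loss = nt_xent_loss(z_a, z_b)
```

When the two views of each sample are nearly identical, as here, the loss is well below the uniform-prediction baseline of log(2N - 1), since each positive pair dominates its row of similarities.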

(Image credit: Schroff et al. 2015)
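The credited figure comes from FaceNet (Schroff et al., 2015), which popularised the triplet loss, an early contrastive objective that pulls an anchor toward a positive example and pushes it away from a negative by a margin. A minimal numpy sketch follows; the function name, margin value, and toy inputs are illustrative assumptions.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss: require the anchor-positive distance to be smaller
    than the anchor-negative distance by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2, axis=1)  # squared L2 distances
    d_neg = np.sum((anchor - negative) ** 2, axis=1)
    return np.maximum(d_pos - d_neg + margin, 0.0).mean()

anchor = np.array([[0.0, 0.0]])
positive = np.array([[0.1, 0.0]])   # close to the anchor
negative = np.array([[2.0, 0.0]])   # far from the anchor

loss_easy = triplet_loss(anchor, positive, negative)  # margin satisfied -> 0.0
loss_hard = triplet_loss(anchor, negative, positive)  # violated -> positive loss
```

The hinge at zero means well-separated triplets contribute no gradient, which is why triplet-based methods typically rely on mining hard or semi-hard negatives.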

Papers

Showing 1701–1725 of 6661 papers

| Title | Status | Hype |
| --- | --- | --- |
| DivCo: Diverse Conditional Image Synthesis via Contrastive Generative Adversarial Network | Code | 1 |
| Reconsidering Representation Alignment for Multi-view Clustering | Code | 1 |
| WenLan: Bridging Vision and Language by Large-Scale Multi-Modal Pre-Training | Code | 1 |
| 3D Human Pose, Shape and Texture from Low-Resolution Images and Videos | Code | 1 |
| Spatially Consistent Representation Learning | Code | 1 |
| VideoMoCo: Contrastive Video Representation Learning with Temporally Adversarial Examples | Code | 1 |
| FSCE: Few-Shot Object Detection via Contrastive Proposal Encoding | Code | 1 |
| Doubly Contrastive Deep Clustering | Code | 1 |
| SimTriplet: Simple Triplet Representation Learning with a Single GPU | Code | 1 |
| Self-Supervised Longitudinal Neighbourhood Embedding | Code | 1 |
| Constrained Contrastive Distribution Learning for Unsupervised Anomaly Detection and Localisation in Medical Images | Code | 1 |
| SoundCLR: Contrastive Learning of Representations For Improved Environmental Sound Classification | Code | 1 |
| Task-Adaptive Neural Network Search with Meta-Contrastive Learning | Code | 1 |
| Partially View-aligned Representation Learning with Noise-robust Contrastive Loss | Code | 1 |
| Panoramic Panoptic Segmentation: Towards Complete Surrounding Understanding via Unsupervised Contrastive Learning | Code | 1 |
| Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning | Code | 1 |
| Learning Disentangled Representation by Exploiting Pretrained Generative Models: A Contrastive Learning View | Code | 1 |
| On Fast Adversarial Robustness Adaptation in Model-Agnostic Meta-Learning | Code | 1 |
| Molecular Contrastive Learning of Representations via Graph Neural Networks | Code | 1 |
| S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration | Code | 1 |
| Contrastive Learning Inverts the Data Generating Process | Code | 1 |
| COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining | Code | 1 |
| Large-Scale Representation Learning on Graphs via Bootstrapping | Code | 1 |
| Unleashing the Power of Contrastive Self-Supervised Visual Models via Contrast-Regularized Fine-Tuning | Code | 1 |
| Negative Data Augmentation | Code | 1 |
Page 69 of 267

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified |
| 2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified |
| 3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified |
| 4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified |
| 5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified |
| 6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified |
| 7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified |
| 8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified |
| 9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified |
| 10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | | 10..5sec | 1 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified |