SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances are close together in the representation space, while dissimilar instances are far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.
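The "pull similar instances together, push dissimilar instances apart" objective can be sketched with the InfoNCE loss that underlies many contrastive methods. This is a minimal NumPy illustration, not the method of any specific paper listed below; the function name, batch shapes, and temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style loss: anchor i should match positive i, with the
    other positives in the batch serving as negatives."""
    # L2-normalize so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = (a @ p.T) / temperature                      # (N, N) similarities
    # Row-wise softmax cross-entropy with the diagonal as the target class
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))
```

Embeddings of two augmented views of the same instance should drive this loss down; unrelated embeddings keep it near log N.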

(Image credit: Schroff et al. 2015)

Papers

Showing 6251–6275 of 6661 papers

Title | Status | Hype
Adversarial Examples can be Effective Data Augmentation for Unsupervised Machine Learning | Code | 0
Partially View-aligned Representation Learning with Noise-robust Contrastive Loss | Code | 1
Panoramic Panoptic Segmentation: Towards Complete Surrounding Understanding via Unsupervised Contrastive Learning | Code | 1
Fool Me Once: Robust Selective Segmentation via Out-of-Distribution Detection with Contrastive Learning | | 0
Using contrastive learning to improve the performance of steganalysis schemes | | 0
Anomaly Detection on Attributed Networks via Contrastive Self-Supervised Learning | Code | 1
Consistent Assignment for Representation Learning | | 0
A Primer on Contrastive Pretraining in Language Processing: Methods, Lessons Learned and Perspectives | | 0
Towards Robust Graph Contrastive Learning | | 0
Provably Improved Context-Based Offline Meta-RL with Attention and Contrastive Learning | | 0
MedAug: Contrastive learning leveraging patient metadata improves representations for chest X-ray interpretation | | 0
Learning Disentangled Representation by Exploiting Pretrained Generative Models: A Contrastive Learning View | Code | 1
On Fast Adversarial Robustness Adaptation in Model-Agnostic Meta-Learning | Code | 1
Molecular Contrastive Learning of Representations via Graph Neural Networks | Code | 1
CUPR: Contrastive Unsupervised Learning for Person Re-identification | | 0
Contrastive Learning Inverts the Data Generating Process | Code | 1
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration | Code | 1
Dissecting Supervised Contrastive Learning | Code | 0
COCO-LM: Correcting and Contrasting Text Sequences for Language Model Pretraining | Code | 1
Self-Supervised Multisensor Change Detection | | 0
Unleashing the Power of Contrastive Self-Supervised Visual Models via Contrast-Regularized Fine-Tuning | Code | 1
Large-Scale Representation Learning on Graphs via Bootstrapping | Code | 1
Semantically-Conditioned Negative Samples for Efficient Contrastive Learning | | 0
CDPAM: Contrastive learning for perceptual audio similarity | Code | 1
Negative Data Augmentation | Code | 1
Page 251 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | 10..5sec1 | | | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified