SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances are close together in the representation space, while dissimilar instances are far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.
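The objective described above is commonly instantiated as the NT-Xent (normalized temperature-scaled cross-entropy) loss used by SimCLR and related methods: each instance is pulled toward its augmented view (the positive) and pushed away from every other instance in the batch (the negatives). A minimal NumPy sketch, not tied to any specific paper listed below:

```python
import numpy as np

def nt_xent_loss(z_i, z_j, temperature=0.5):
    """NT-Xent contrastive loss over a batch of paired embeddings.

    z_i, z_j: (N, d) arrays holding embeddings of two augmented
    views of the same N instances; row k of z_i and row k of z_j
    form a positive pair, all other rows act as negatives.
    """
    z = np.concatenate([z_i, z_j], axis=0)            # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # unit-normalize -> cosine sim
    sim = z @ z.T / temperature                       # (2N, 2N) scaled similarities
    np.fill_diagonal(sim, -np.inf)                    # mask self-similarity
    n = len(z_i)
    # the positive for row k is row k+N (and vice versa)
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # cross-entropy: log-probability of the positive among all candidates
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

When the two views of each instance map to nearby points and different instances map to distant points, the loss is small; this is exactly the "similar close, dissimilar far" geometry described above.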

(Image credit: Schroff et al. 2015)

Papers

Showing 1026–1050 of 6661 papers

Title | Status | Hype
CPLIP: Zero-Shot Learning for Histopathology with Comprehensive Vision-Language Alignment | Code | 1
A Brain Graph Foundation Model: Pre-Training and Prompt-Tuning for Any Atlas and Disorder | Code | 1
CP2: Copy-Paste Contrastive Pretraining for Semantic Segmentation | Code | 1
CrossCBR: Cross-view Contrastive Learning for Bundle Recommendation | Code | 1
Cross-Modal Information-Guided Network using Contrastive Learning for Point Cloud Registration | Code | 1
COSTA: Covariance-Preserving Feature Augmentation for Graph Contrastive Learning | Code | 1
Bag of Instances Aggregation Boosts Self-supervised Distillation | Code | 1
BadHash: Invisible Backdoor Attacks against Deep Hashing with Clean Label | Code | 1
BadCLIP: Dual-Embedding Guided Backdoor Attack on Multimodal Contrastive Learning | Code | 1
CoSQA: 20,000+ Web Queries for Code Search and Question Answering | Code | 1
CoT-BERT: Enhancing Unsupervised Sentence Representation through Chain-of-Thought | Code | 1
Correspondence Matters for Video Referring Expression Comprehension | Code | 1
Correct-N-Contrast: A Contrastive Approach for Improving Robustness to Spurious Correlations | Code | 1
CorruptEncoder: Data Poisoning based Backdoor Attacks to Contrastive Learning | Code | 1
Constrained Contrastive Distribution Learning for Unsupervised Anomaly Detection and Localisation in Medical Images | Code | 1
FLIP: Fine-grained Alignment between ID-based Models and Pretrained Language Models for CTR Prediction | Code | 1
COPNER: Contrastive Learning with Prompt Guiding for Few-shot Named Entity Recognition | Code | 1
CoRTX: Contrastive Framework for Real-time Explanation | Code | 1
Contrastive Domain Adaptation for Time-Series via Temporal Mixup | Code | 1
Adaptive Graph Contrastive Learning for Recommendation | Code | 1
ContrastVAE: Contrastive Variational AutoEncoder for Sequential Recommendation | Code | 1
Learning the Unlearned: Mitigating Feature Suppression in Contrastive Learning | Code | 1
Contrast then Memorize: Semantic Neighbor Retrieval-Enhanced Inductive Multimodal Knowledge Graph Completion | Code | 1
CONVERT: Contrastive Graph Clustering with Reliable Augmentation | Code | 1
ContrastNet: A Contrastive Learning Framework for Few-Shot Text Classification | Code | 1
Page 42 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
110..5sec1Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified