SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn an embedding space in which similar instances lie close together while dissimilar instances lie far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. The learned representations can then serve as features for downstream tasks such as classification and clustering.

(Image credit: Schroff et al. 2015)
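The "close together / far apart" objective described above is most commonly instantiated as the InfoNCE (also called NT-Xent) loss, popularized by frameworks such as SimCLR. Below is a minimal NumPy sketch of that loss; the function names and the temperature value are illustrative choices, not taken from any particular library.

```python
import numpy as np

def l2_normalize(x, eps=1e-12):
    # Project each embedding onto the unit hypersphere so that the
    # dot product below equals cosine similarity.
    return x / (np.linalg.norm(x, axis=-1, keepdims=True) + eps)

def info_nce(z_a, z_b, temperature=0.1):
    """InfoNCE / NT-Xent loss for a batch of positive pairs.

    z_a[i] and z_b[i] are embeddings of two views of the same instance
    (a positive pair); every other row in the batch acts as a negative.
    """
    z_a = l2_normalize(np.asarray(z_a, dtype=float))
    z_b = l2_normalize(np.asarray(z_b, dtype=float))
    logits = z_a @ z_b.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal as the target class: pull z_a[i]
    # toward z_b[i], push it away from every z_b[j], j != i.
    m = logits.max(axis=1, keepdims=True)  # subtract max for numerical stability
    log_probs = logits - m - np.log(np.exp(logits - m).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 32))
print(info_nce(z, z))                         # small: each positive dominates its row
print(info_nce(z, rng.normal(size=(8, 32))))  # larger: the positive is no longer special
```

The temperature divides the similarities before the softmax: lowering it sharpens the distribution and penalizes hard negatives more heavily, which is why it is a key hyperparameter in most contrastive setups.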

Papers

Showing 1651–1675 of 6661 papers

Title | Status | Hype
CSP: Self-Supervised Contrastive Spatial Pre-Training for Geospatial-Visual Representations | Code | 1
Improving Composed Image Retrieval via Contrastive Learning with Scaling Positives and Negatives | Code | 1
Towards Cross-Table Masked Pretraining for Web Data Mining | Code | 1
Improving Contrastive Learning by Visualizing Feature Transformation | Code | 1
A Simple and Effective Self-Supervised Contrastive Learning Framework for Aspect Detection | Code | 1
Improving Contrastive Learning on Imbalanced Seed Data via Open-World Sampling | Code | 1
Improving Contrastive Learning of Sentence Embeddings from AI Feedback | Code | 1
Pretext-Contrastive Learning: Toward Good Practices in Self-supervised Video Representation Leaning | Code | 1
CL4CTR: A Contrastive Learning Framework for CTR Prediction | Code | 1
Improving Contrastive Learning on Imbalanced Data via Open-World Sampling | Code | 1
CURL: Contrastive Unsupervised Representation Learning for Reinforcement Learning | Code | 1
CURL: Contrastive Unsupervised Representations for Reinforcement Learning | Code | 1
CLAD: Robust Audio Deepfake Detection Against Manipulation Attacks with Contrastive Learning | Code | 1
Improving Hateful Meme Detection through Retrieval-Guided Contrastive Learning | Code | 1
Improving Text-to-Image Synthesis Using Contrastive Learning | Code | 1
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
Data-Efficient Contrastive Self-supervised Learning: Most Beneficial Examples for Supervised Learning Contribute the Least | Code | 1
A Simple Contrastive Learning Objective for Alleviating Neural Text Degeneration | Code | 1
Company-as-Tribe: Company Financial Risk Assessment on Tribe-Style Graph with Hierarchical Graph Neural Networks | Code | 1
Improving Molecular Contrastive Learning via Faulty Negative Mitigation and Decomposed Fragment Contrast | Code | 1
A simple, efficient and scalable contrastive masked autoencoder for learning visual representations | Code | 1
Improving Self-Supervised Learning by Characterizing Idealized Representations | Code | 1
CLAMP-ViT: Contrastive Data-Free Learning for Adaptive Post-Training Quantization of ViTs | Code | 1
Semi-supervised Crowd Counting via Density Agency | Code | 1
Leveraging Hidden Positives for Unsupervised Semantic Segmentation | Code | 1
Page 67 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | 10..5sec1 | | | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified