SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances are close together in the representation space, while dissimilar instances are far apart.
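A common way to realise this objective is the InfoNCE loss, where each anchor is pulled toward its matching "positive" embedding and pushed away from all other items in the batch. Below is a minimal NumPy sketch of that idea; the function name `info_nce_loss` and the temperature value are illustrative, not taken from any specific paper on this page.

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE-style contrastive loss: row i of `positives` is the positive
    for row i of `anchors`; every other row serves as a negative."""
    # L2-normalise so dot products become cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    # softmax cross-entropy with the diagonal (matching pairs) as the target
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_prob)))
```

The loss is small when each anchor is most similar to its own positive, and grows when mismatched pairs dominate the similarity matrix.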

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.

(Image credit: Schroff et al. 2015)
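The credited figure is from Schroff et al. (2015), the FaceNet paper, which popularised the triplet loss: a contrastive objective over (anchor, positive, negative) triples. A minimal NumPy sketch, with the margin value chosen for illustration:

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss (Schroff et al. 2015): the anchor-positive distance should
    be smaller than the anchor-negative distance by at least `margin`."""
    d_pos = np.sum((anchor - positive) ** 2, axis=1)  # squared distances
    d_neg = np.sum((anchor - negative) ** 2, axis=1)
    return float(np.mean(np.maximum(d_pos - d_neg + margin, 0.0)))
```

The loss is zero once every negative is at least `margin` farther from the anchor than the corresponding positive.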

Papers

Showing 2976–3000 of 6661 papers

Title | Status | Hype
Efficient Labelling of Affective Video Datasets via Few-Shot & Multi-Task Contrastive Learning | Code | 0
Efficient Information Extraction in Few-Shot Relation Classification through Contrastive Representation Learning | Code | 0
Cone: Unsupervised Contrastive Opinion Extraction | Code | 0
Efficient Hierarchical Contrastive Self-supervising Learning for Time Series Classification via Importance-aware Resolution Selection | Code | 0
Alleviating Sparsity of Open Knowledge Graphs with Ternary Contrastive Learning | Code | 0
Improving Unsupervised Relation Extraction by Augmenting Diverse Sentence Pairs | Code | 0
Improving Unsupervised Task-driven Models of Ventral Visual Stream via Relative Position Predictivity | Code | 0
Conditional Supervised Contrastive Learning for Fair Text Classification | Code | 0
HU at SemEval-2024 Task 8A: Can Contrastive Learning Learn Embeddings to Detect Machine-Generated Text? | Code | 0
Contrastive Transformer Learning with Proximity Data Generation for Text-Based Person Search | Code | 0
Improving the Robustness of Dense Retrievers Against Typos via Multi-Positive Contrastive Learning | Code | 0
Efficient Cluster-Based k-Nearest-Neighbor Machine Translation | Code | 0
Conditional Negative Sampling for Contrastive Learning of Visual Representations | Code | 0
Improving the Robustness of Knowledge-Grounded Dialogue via Contrastive Learning | Code | 0
Efficient block contrastive learning via parameter-free meta-node approximation | Code | 0
Improving the Consistency in Cross-Lingual Cross-Modal Retrieval with 1-to-K Contrastive Learning | Code | 0
Adaptive Contrastive Learning with Dynamic Correlation for Multi-Phase Organ Segmentation | Code | 0
Contrastive Variational Autoencoder Enhances Salient Features | Code | 0
Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models | Code | 0
Efficient and Interpretable Information Retrieval for Product Question Answering with Heterogeneous Data | Code | 0
Efficient Adversarial Contrastive Learning via Robustness-Aware Coreset Selection | Code | 0
Can Self-Supervised Representation Learning Methods Withstand Distribution Shifts and Corruptions? | Code | 0
Adaptive Contrastive Learning on Multimodal Transformer for Review Helpfulness Predictions | Code | 0
Effective Open Intent Classification with K-center Contrastive Learning and Adjustable Decision Boundary | Code | 0
ConCur: Self-supervised graph representation based on contrastive learning with curriculum negative sampling | Code | 0
Page 120 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
110..5sec1Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified