SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances are close together in the representation space, while dissimilar instances are far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.
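A common instantiation of this idea is the InfoNCE (NT-Xent) objective used by methods such as SimCLR: each anchor is scored against its positive and against all other samples in the batch as negatives. A minimal sketch (function name and temperature value are illustrative, not from any specific paper):

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE loss: row i of `positives` is the positive pair for row i
    of `anchors`; every other row in the batch acts as a negative."""
    # L2-normalise so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    # Row i's correct "class" is column i (its positive pair):
    # cross-entropy against the diagonal
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Minimising this loss pulls each anchor toward its positive (large diagonal similarity) while pushing it away from the in-batch negatives, which is exactly the "similar close, dissimilar far" behaviour described above.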

(Image credit: Schroff et al. 2015)
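The credited figure is from Schroff et al. (2015), the FaceNet paper, which popularised the triplet loss: an anchor is pulled toward a positive and pushed at least a margin farther from a negative. A minimal sketch assuming squared-Euclidean distances between embeddings (the margin value is illustrative):

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """Triplet loss (Schroff et al., 2015): encourage
    d(anchor, positive) + margin < d(anchor, negative)."""
    d_pos = np.sum((anchor - positive) ** 2, axis=-1)  # squared distance to positive
    d_neg = np.sum((anchor - negative) ** 2, axis=-1)  # squared distance to negative
    # Hinge: zero loss once the negative is margin farther than the positive
    return np.mean(np.maximum(d_pos - d_neg + margin, 0.0))
```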

Papers

Showing 5626–5650 of 6661 papers

Title | Status | Hype
Explicit View-labels Matter: A Multifacet Complementarity Study of Multi-view Clustering | | 0
CODE-MVP: Learning to Represent Source Code from Multiple Views with Contrastive Pre-Training | | 0
Lexical Knowledge Internalization for Neural Dialog Generation | Code | 0
Analysing the Robustness of Dual Encoders for Dense Retrieval Against Misspellings | Code | 0
Do More Negative Samples Necessarily Hurt in Contrastive Learning? | | 0
i-Code: An Integrative and Composable Multimodal Learning Framework | | 0
FastGCL: Fast Self-Supervised Learning on Graphs via Contrastive Neighborhood Aggregation | | 0
Attention-wise masked graph contrastive learning for predicting molecular property | | 0
A Contrastive Framework for Learning Sentence Representations from Pairwise and Triple-wise Perspective in Angular Space | | 0
KNN-Contrastive Learning for Out-of-Domain Intent Classification | | 0
Contrastive Learning-Enhanced Nearest Neighbor Mechanism for Multi-Label Text Classification | | 0
Cross-Lingual Contrastive Learning for Fine-Grained Entity Typing for Low-Resource Languages | Code | 0
GL-CLeF: A Global–Local Contrastive Learning Framework for Cross-lingual Spoken Language Understanding | Code | 0
Controlled Text Generation Using Dictionary Prior in Variational Autoencoders | | 0
When does CLIP generalize better than unimodal models? When judging human-centric concepts | | 0
Mitigating the Inconsistency Between Word Saliency and Model Confidence with Pathological Contrastive Training | | 0
Mitigating Contradictions in Dialogue Based on Contrastive Learning | | 0
Syntax-guided Contrastive Learning for Pre-trained Language Model | | 0
RoCBert: Robust Chinese Bert with Multimodal Contrastive Pretraining | | 0
NYCU_TWD@LT-EDI-ACL2022: Ensemble Models with VADER and Contrastive Learning for Detecting Signs of Depression from Social Media | | 0
UTC: A Unified Transformer with Inter-Task Contrastive Learning for Visual Dialog | | 0
Utilizing Cross-Modal Contrastive Learning to Improve Item Categorization BERT Model | | 0
Probing Cross-Lingual Lexical Knowledge from Multilingual Sentence Encoders | | 0
Loss Function Entropy Regularization for Diverse Decision Boundaries | | 0
Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning | | 0
Page 226 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | | 10..5sec | 1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified
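The tables above report Top-1 accuracy: the percentage of samples whose single highest-scoring prediction matches the ground-truth label. A minimal sketch of how such a metric is computed from model logits:

```python
import numpy as np

def top1_accuracy(logits, labels):
    """Percentage of rows where the argmax prediction equals the label,
    reported on a 0-100 scale as in the benchmark tables."""
    preds = np.argmax(logits, axis=1)       # highest-scoring class per sample
    return 100.0 * np.mean(preds == labels)
```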