SOTAVerified

Contrastive Learning

Contrastive learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances (for example, two augmented views of the same image) are close together in the representation space, while dissimilar instances are far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. In these tasks, the learned representations can be used as features for downstream tasks such as classification and clustering.
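The objective described above is most commonly implemented as an InfoNCE-style loss: each instance's two views form a positive pair, and all other instances in the batch serve as negatives. The sketch below is a minimal NumPy illustration of that idea; the function name, temperature value, and toy data are illustrative, not taken from any particular paper on this page.

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.1):
    """InfoNCE loss over a batch of paired embeddings.

    z_a[i] and z_b[i] are two views of the same instance (a positive
    pair); every other pairing in the batch is treated as a negative.
    """
    # L2-normalize so dot products are cosine similarities
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature           # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    # Softmax cross-entropy where row i's correct class is column i
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))

rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 32))
positives = anchors + 0.05 * rng.normal(size=(8, 32))  # slightly perturbed views
matched_loss = info_nce_loss(anchors, positives)
random_loss = info_nce_loss(anchors, rng.normal(size=(8, 32)))
# Matched views should incur a much lower loss than random pairings
print(matched_loss < random_loss)
```

Minimizing this loss pulls each positive pair together and pushes the batch negatives apart, which is exactly the geometry the learned representation space is meant to have.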

(Image credit: Schroff et al. 2015)

Papers

Showing 3051–3100 of 6661 papers

Title | Status | Hype
Improving Unsupervised Task-driven Models of Ventral Visual Stream via Relative Position Predictivity | Code | 0
Composition-contrastive Learning for Sentence Embeddings | Code | 0
IITK at SemEval-2024 Task 1: Contrastive Learning and Autoencoders for Semantic Textual Relatedness in Multilingual Texts | Code | 0
Self-supervised Vision Transformers for 3D Pose Estimation of Novel Objects | Code | 0
Auto-Formula: Recommend Formulas in Spreadsheets using Contrastive Learning for Table Representations | Code | 0
Improving Unsupervised Relation Extraction by Augmenting Diverse Sentence Pairs | Code | 0
Compositional Image Retrieval via Instruction-Aware Contrastive Learning | Code | 0
Improving the Robustness of Dense Retrievers Against Typos via Multi-Positive Contrastive Learning | Code | 0
Adapting to Change: Robust Counterfactual Explanations in Dynamic Data Landscapes | Code | 0
Architecture Matters: Uncovering Implicit Mechanisms in Graph Contrastive Learning | Code | 0
Improving the Robustness of Knowledge-Grounded Dialogue via Contrastive Learning | Code | 0
Semantic Information in Contrastive Learning | Code | 0
Improving Sentence Similarity Estimation for Unsupervised Extractive Summarization | Code | 0
DyTSCL: Dynamic graph representation via tempo-structural contrastive learning | Code | 0
All4One: Symbiotic Neighbour Contrastive Learning via Self-Attention and Redundancy Reduction | Code | 0
Improving the Consistency in Cross-Lingual Cross-Modal Retrieval with 1-to-K Contrastive Learning | Code | 0
Dynamically Scaled Temperature in Self-Supervised Contrastive Learning | Code | 0
AuralSAM2: Enabling SAM2 Hear Through Pyramid Audio-Visual Feature Prompting | Code | 0
Invariant Graph Learning Meets Information Bottleneck for Out-of-Distribution Generalization | Code | 0
Arithmetic-Based Pretraining -- Improving Numeracy of Pretrained Language Models | Code | 0
ImpScore: A Learnable Metric For Quantifying The Implicitness Level of Language | Code | 0
Improving Multi-lingual Alignment Through Soft Contrastive Learning | Code | 0
A Universal Knowledge Embedded Contrastive Learning Framework for Hyperspectral Image Classification | Code | 0
Complementary Calibration: Boosting General Continual Learning with Collaborative Distillation and Self-Supervision | Code | 0
Improving Nonlinear Projection Heads using Pretrained Autoencoder Embeddings | Code | 0
Improving Long-tailed Object Detection with Image-Level Supervision by Multi-Task Collaborative Learning | Code | 0
A Universal Framework for Compressing Embeddings in CTR Prediction | Code | 0
Improving Medical Multi-modal Contrastive Learning with Expert Annotations | Code | 0
Comparing representations of biological data learned with different AI paradigms, augmenting and cropping strategies | Code | 0
Improving Language Transfer Capability of Decoder-only Architecture in Multilingual Neural Machine Translation | Code | 0
Improving Micro-video Recommendation via Contrastive Multiple Interests | Code | 0
Implicit Contrastive Representation Learning with Guided Stop-gradient | Code | 0
Improving Paratope and Epitope Prediction by Multi-Modal Contrastive Learning and Interaction Informativeness Estimation | Code | 0
Improving Fairness of Automated Chest X-ray Diagnosis by Contrastive Learning | Code | 0
Dynamic Graph Representation with Contrastive Learning for Financial Market Prediction: Integrating Temporal Evolution and Static Relations | Code | 0
Improving Factuality of Abstractive Summarization without Sacrificing Summary Quality | Code | 0
Calibrating and Improving Graph Contrastive Learning | Code | 0
Co-modeling the Sequential and Graphical Routes for Peptide Representation Learning | Code | 0
Adapting Pretrained Language Models for Citation Classification via Self-Supervised Contrastive Learning | Code | 0
Correlation between Alignment-Uniformity and Performance of Dense Contrastive Representations | Code | 0
Dynamic Contrastive Learning for Time Series Representation | Code | 0
Improving Query-by-Vocal Imitation with Contrastive Learning and Audio Pretraining | Code | 0
Symmetric Graph Contrastive Learning against Noisy Views for Recommendation | Code | 0
Improving Contrastive Learning of Sentence Embeddings with Focal-InfoNCE | Code | 0
Communicate to Play: Pragmatic Reasoning for Efficient Cross-Cultural Communication in Codenames | Code | 0
Commonsense Knowledge Graph Completion Via Contrastive Pretraining and Node Clustering | Code | 0
Improving Contrastive Learning for Referring Expression Counting | Code | 0
DWE+: Dual-Way Matching Enhanced Framework for Multimodal Entity Linking | Code | 0
DWCL: Dual-Weighted Contrastive Learning for Multi-View Clustering | Code | 0
A Unified Membership Inference Method for Visual Self-supervised Encoder via Part-aware Capability | Code | 0
Page 62 of 134

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | 10..5sec | 1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified