SOTAVerified

Contrastive Learning

Contrastive Learning is a deep learning technique for unsupervised representation learning. The goal is to learn a representation of data such that similar instances are close together in the representation space, while dissimilar instances are far apart.

It has been shown to be effective in various computer vision and natural language processing tasks, including image retrieval, zero-shot learning, and cross-modal retrieval. The learned representations can then serve as features for downstream tasks such as classification and clustering.

(Image credit: Schroff et al. 2015)
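The core idea above, pulling matched (similar) pairs together while pushing mismatched pairs apart in the representation space, is commonly implemented with the InfoNCE loss used by SimCLR-style methods. Below is a minimal NumPy sketch; the function name and temperature value are illustrative, not taken from this page:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.1):
    """InfoNCE contrastive loss: row i of `positives` is the positive
    pair for row i of `anchors`; all other rows act as negatives."""
    # L2-normalize embeddings so dot products are cosine similarities
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    logits = a @ p.T / temperature               # (N, N) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # The "correct class" for anchor i is column i (its positive pair)
    return -np.mean(np.diag(log_probs))
```

With a batch of N pairs, each anchor is scored against all N candidates, so the remaining N−1 rows automatically serve as negatives; the loss is low when every anchor is most similar to its own positive.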

Papers

Showing 4326–4350 of 6661 papers

Title | Status | Hype
Contrastive Learning with Negative Sampling Correction | | 0
Preventing Collapse in Contrastive Learning with Orthonormal Prototypes (CLOP) | | 0
Contrastive Learning with Positive-Negative Frame Mask for Music Representation | | 0
Contrastively Enforcing Distinctiveness for Multi-Label Classification | | 0
Contrastive masked auto-encoders based self-supervised hashing for 2D image and 3D point cloud cross-modal retrieval | | 0
Contrastive Masked Autoencoders for Character-Level Open-Set Writer Identification | | 0
Contrastive Mean-Shift Learning for Generalized Category Discovery | | 0
Contrastive Multi-graph Learning with Neighbor Hierarchical Sifting for Semi-supervised Text Classification | | 0
Contrastive Multi-Level Graph Neural Networks for Session-based Recommendation | | 0
Contrastive Multi-Modal Representation Learning for Spark Plug Fault Diagnosis | | 0
Contrastive Multi-Task Dense Prediction | | 0
CSI: Contrastive Data Stratification for Interaction Prediction and its Application to Compound-Protein Interaction Prediction | | 0
Contrastive Multi-view Framework for Customer Lifetime Value Prediction | | 0
Contrastive Multi-view Subspace Clustering of Hyperspectral Images based on Graph Convolutional Networks | | 0
Contrastive Mutual Information Maximization for Binary Neural Networks | | 0
Contrastive News and Social Media Linking using BERT for Articles and Tweets across Dual Platforms | | 0
Contrastive Perplexity for Controlled Generation: An Application in Detoxifying Large Language Models | | 0
Contrastive Predictive Autoencoders for Dynamic Point Cloud Self-Supervised Learning | | 0
Contrastive Predictive Coding for Anomaly Detection | | 0
Contrastive Pre-training for Deep Session Data Understanding | | 0
Contrastive pretraining for semantic segmentation is robust to noisy positive pairs | | 0
Contrastive Pre-training for Zero-Shot Information Retrieval | | 0
Pre-training General Trajectory Embeddings with Maximum Multi-view Entropy Coding | | 0
Contrastive Prompt Learning-based Code Search based on Interaction Matrix | | 0
Contrastive Quant: Quantization Makes Stronger Contrastive Learning | | 0
Page 174 of 267

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | ResNet50 | ImageNet Top-1 Accuracy | 73.6 | | Unverified
2 | ResNet50 | ImageNet Top-1 Accuracy | 73 | | Unverified
3 | ResNet50 | ImageNet Top-1 Accuracy | 71.1 | | Unverified
4 | ResNet50 | ImageNet Top-1 Accuracy | 69.3 | | Unverified
5 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 67.6 | | Unverified
6 | ResNet50 (v2) | ImageNet Top-1 Accuracy | 63.8 | | Unverified
7 | ResNet50 | ImageNet Top-1 Accuracy | 63.6 | | Unverified
8 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
9 | ResNet50 | ImageNet Top-1 Accuracy | 61.5 | | Unverified
10 | ResNet50 (4×) | ImageNet Top-1 Accuracy | 61.3 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | 10..5sec | | 1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 84.77 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | IPCL (ResNet18) | Accuracy (Top-1) | 85.55 | | Unverified