SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
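The definition above can be illustrated with a minimal denoising-autoencoder sketch in NumPy: the auxiliary task is reconstructing clean inputs from corrupted copies, which needs no labels. All sizes, data, and hyperparameters here are illustrative assumptions, not taken from any listed paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: 200 samples, 16 features (toy stand-in for a real corpus).
X = rng.normal(size=(200, 16))

# One-hidden-layer autoencoder: encoder weights W1, decoder weights W2.
W1 = rng.normal(scale=0.1, size=(16, 8))
W2 = rng.normal(scale=0.1, size=(8, 16))

def forward(Xb):
    H = np.tanh(Xb @ W1)   # encoder activations
    return H, H @ W2       # linear decoder reconstruction

lr = 0.01
losses = []
for step in range(300):
    # Self-supervised objective: reconstruct clean X from a noisy copy.
    X_noisy = X + rng.normal(scale=0.3, size=X.shape)
    H, X_hat = forward(X_noisy)
    err = X_hat - X
    losses.append(float((err ** 2).mean()))
    # Backpropagate the squared-error loss through both layers.
    dW2 = H.T @ err / len(X)
    dH = (err @ W2.T) * (1 - H ** 2)   # tanh derivative
    dW1 = X_noisy.T @ dH / len(X)
    W1 -= lr * dW1
    W2 -= lr * dW2
```

After pre-training, `W1` would typically initialize the encoder of a downstream supervised model, which is then fine-tuned on labeled data.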

Papers

Showing 231–240 of 265 papers

| Title | Status | Hype |
|---|---|---|
| Pretraining Generative Flow Networks with Inexpensive Rewards for Molecular Graph Generation | | 0 |
| Pre-training Text Representations as Meta Learning | | 0 |
| Provable Benefits of Unsupervised Pre-training and Transfer Learning via Single-Index Models | | 0 |
| Range-aware Positional Encoding via High-order Pretraining: Theory and Practice | | 0 |
| Recognizing UMLS Semantic Types with Deep Learning | | 0 |
| Representation Learning for Weakly Supervised Relation Extraction | | 0 |
| Research on CPI Prediction Based on Natural Language Processing | | 0 |
| Residual Contrastive Learning for Image Reconstruction: Learning Transferable Representations from Noisy Images | | 0 |
| Risk Assessment Framework for Code LLMs via Leveraging Internal States | | 0 |
| R-LAtte: Attention Module for Visual Control via Reinforcement Learning | | 0 |
Page 24 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |