SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
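The idea can be illustrated with a minimal sketch (not any specific paper's method): a linear autoencoder is pre-trained on unlabeled data with a reconstruction auxiliary task, and its encoder weights are then reused as a feature extractor for a small supervised problem. The toy data, dimensions, and variable names below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabeled data: 200 points in 4-D whose variation lies in a 2-D subspace.
basis = rng.normal(size=(2, 4))
unlabeled = rng.normal(size=(200, 2)) @ basis

# --- Unsupervised pre-training: a linear autoencoder fit by gradient descent.
# The auxiliary task is reconstructing the input; no labels are used.
W = rng.normal(scale=0.1, size=(4, 2))   # encoder weights; decoder is W.T
lr = 0.01
for _ in range(1500):
    err = unlabeled @ W @ W.T - unlabeled              # reconstruction residual
    grad = unlabeled.T @ err @ W + err.T @ unlabeled @ W
    W -= lr * grad / len(unlabeled)

# --- Supervised fine-tuning: reuse the pre-trained encoder on a small labeled set.
labeled_x = unlabeled[:20]
labeled_y = labeled_x @ rng.normal(size=4)             # toy regression targets
feats = labeled_x @ W                                  # pre-trained features
head, *_ = np.linalg.lstsq(feats, labeled_y, rcond=None)
pred = feats @ head
```

The pre-training step sees far more (unlabeled) data than the fine-tuning step, which is the usual motivation: the encoder captures the data's structure cheaply, so the supervised head can be fit from only a handful of labels.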

Papers

Showing 111–120 of 265 papers

| Title | Status | Hype |
| --- | --- | --- |
| Learning Non-Linear Reconstruction Models for Image Set Classification | | 0 |
| Machine Translation Pre-training for Data-to-Text Generation - A Case Study in Czech | | 0 |
| FairSISA: Ensemble Post-Processing to Improve Fairness of Unlearning in LLMs | | 0 |
| Generalized 3D Self-supervised Learning Framework via Prompted Foreground-Aware Feature Contrast | | 0 |
| Co-Morbidity Exploration on Wearables Activity Data Using Unsupervised Pre-training and Multi-Task Learning | | 0 |
| Incomplete Multi-View Multi-label Learning via Disentangled Representation and Label Semantic Embedding | | 0 |
| Incorporating Unlabelled Data into Bayesian Neural Networks | | 0 |
| Extractive NarrativeQA with Heuristic Pre-Training | | 0 |
| Is MixIT Really Unsuitable for Correlated Sources? Exploring MixIT for Unsupervised Pre-training in Music Source Separation | | 0 |
| Extracting UMLS Concepts from Medical Text Using General and Domain-Specific Deep Learning Models | | 0 |
Page 12 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |