Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
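As a concrete illustration of the idea, here is a minimal sketch (not taken from any paper listed below) of an unsupervised pre-training step: a small autoencoder trained with plain NumPy gradient descent on unlabeled data, whose encoder weights could then initialize a supervised model. All array shapes, hyperparameters, and variable names are illustrative assumptions.

```python
import numpy as np

# Illustrative unsupervised pre-training: an autoencoder learns a
# reconstruction pretext task on unlabeled data. The learned encoder
# weights (W_enc) would later initialize a supervised network.

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 16))        # unlabeled data: 200 samples, 16 features

n_in, n_hidden = 16, 8
W_enc = rng.normal(scale=0.1, size=(n_in, n_hidden))   # encoder weights
W_dec = rng.normal(scale=0.1, size=(n_hidden, n_in))   # decoder weights
lr = 0.01

def reconstruction_loss(X, W_enc, W_dec):
    H = np.tanh(X @ W_enc)            # encoder: compressed representation
    X_hat = H @ W_dec                 # decoder: reconstruction
    return ((X_hat - X) ** 2).mean()

initial = reconstruction_loss(X, W_enc, W_dec)

for _ in range(500):
    H = np.tanh(X @ W_enc)
    X_hat = H @ W_dec
    err = 2.0 * (X_hat - X) / X.size          # dL/dX_hat
    grad_dec = H.T @ err                      # dL/dW_dec
    grad_hidden = (err @ W_dec.T) * (1 - H ** 2)  # backprop through tanh
    grad_enc = X.T @ grad_hidden              # dL/dW_enc
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final = reconstruction_loss(X, W_enc, W_dec)
```

After this loop the reconstruction loss should have decreased, and `W_enc` holds features learned without any labels; that transfer of `W_enc` into a downstream supervised model is the essence of the pre-train-then-fine-tune recipe.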

Papers

Showing 101–110 of 265 papers

| Title | Status | Hype |
|---|---|---|
| Pre-train and Learn: Preserve Global Information for Graph Neural Networks | Code | 0 |
| Post Training in Deep Learning with Last Kernel | Code | 0 |
| CochCeps-Augment: A Novel Self-Supervised Contrastive Learning Using Cochlear Cepstrum-based Masking for Speech Emotion Recognition | Code | 0 |
| Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data | Code | 0 |
| MML: Maximal Multiverse Learning for Robust Fine-Tuning of Language Models | Code | 0 |
| RMDL: Random Multimodel Deep Learning for Classification | Code | 0 |
| Learning of feature points without additional supervision improves reinforcement learning from images | Code | 0 |
| m2caiSeg: Semantic Segmentation of Laparoscopic Images using Convolutional Neural Networks | Code | 0 |
| Calibrating Language Models with Adaptive Temperature Scaling | Code | 0 |
| Learning Deep Representations Using Convolutional Auto-encoders with Symmetric Skip Connections | Code | 0 |
Page 11 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |