SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
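As a minimal illustration of the idea (a toy sketch, not the method of any paper listed below), the pretext task can be as simple as reconstructing unlabeled inputs: train a linear autoencoder with gradient descent on unlabeled vectors, then reuse the learned encoder weights to initialize a downstream model.

```python
import numpy as np

# Toy self-supervised pretext task: pre-train a linear autoencoder
# to reconstruct its own unlabeled input. The encoder W learned here
# could then initialize a downstream supervised model. Shapes, learning
# rate, and step count are illustrative assumptions.
rng = np.random.default_rng(0)
X = rng.normal(size=(256, 8))        # unlabeled data: 256 samples x 8 features

d_in, d_h = 8, 3
W = rng.normal(scale=0.1, size=(d_in, d_h))   # encoder weights
V = rng.normal(scale=0.1, size=(d_h, d_in))   # decoder weights
lr = 0.05

def recon_loss(X, W, V):
    """Mean squared reconstruction error of the autoencoder."""
    R = X @ W @ V - X
    return float((R ** 2).mean())

initial = recon_loss(X, W, V)
for _ in range(500):
    H = X @ W                         # latent codes
    R = H @ V - X                     # reconstruction residual
    scale = 2.0 / R.size
    gV = scale * (H.T @ R)            # dL/dV
    gW = scale * (X.T @ (R @ V.T))    # dL/dW
    W -= lr * gW
    V -= lr * gV
final = recon_loss(X, W, V)
# Pre-training reduces reconstruction error on the unlabeled data,
# so `final` is smaller than `initial`.
```

The same pattern generalizes to the pretext tasks used by the papers below (masking, contrastive objectives, rotation prediction): only the auxiliary loss changes, while the pre-trained encoder is what carries over to the downstream task.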

Papers

Showing 91–100 of 265 papers

| Title | Status | Hype |
| --- | --- | --- |
| Self-Supervised Modality-Agnostic Pre-Training of Swin Transformers | Code | 0 |
| SMILES Transformer: Pre-trained Molecular Fingerprint for Low Data Drug Discovery | Code | 0 |
| Post Training in Deep Learning with Last Kernel | Code | 0 |
| Pre-train and Learn: Preserve Global Information for Graph Neural Networks | Code | 0 |
| COLA: COarse LAbel pre-training for 3D semantic segmentation of sparse LiDAR datasets | Code | 0 |
| Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data | Code | 0 |
| CochCeps-Augment: A Novel Self-Supervised Contrastive Learning Using Cochlear Cepstrum-based Masking for Speech Emotion Recognition | Code | 0 |
| MML: Maximal Multiverse Learning for Robust Fine-Tuning of Language Models | Code | 0 |
| Unsupervised Pre-Training of Image Features on Non-Curated Data | Code | 0 |
| m2caiSeg: Semantic Segmentation of Laparoscopic Images using Convolutional Neural Networks | Code | 0 |
Page 10 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |