SOTAVerified

Unsupervised Pre-training

Pre-training a neural network with unsupervised (self-supervised) auxiliary tasks on unlabeled data, typically before fine-tuning on a downstream labeled task.
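The idea can be illustrated with a minimal toy sketch (not taken from any paper listed below; all names and sizes are illustrative): an autoencoder is trained on unlabeled data using reconstruction as the auxiliary task, and the learned encoder weights would then initialize a supervised model for fine-tuning.

```python
import numpy as np

# Toy unsupervised pre-training: a one-hidden-layer autoencoder trained
# by gradient descent on unlabeled data. The auxiliary task is input
# reconstruction; no labels are used at this stage.
rng = np.random.default_rng(0)
X_unlabeled = rng.normal(size=(256, 8))  # "unlabeled" data, 8 features

n_hidden = 4
W_enc = rng.normal(scale=0.1, size=(8, n_hidden))  # encoder weights
W_dec = rng.normal(scale=0.1, size=(n_hidden, 8))  # decoder weights

baseline = np.mean(X_unlabeled ** 2)  # MSE of predicting all zeros
lr = 0.02
for _ in range(2000):
    H = np.tanh(X_unlabeled @ W_enc)        # encode
    X_hat = H @ W_dec                       # decode (linear)
    err = X_hat - X_unlabeled               # reconstruction error
    # Backpropagate the mean-squared reconstruction loss.
    grad_dec = H.T @ err / len(X_unlabeled)
    grad_H = err @ W_dec.T * (1 - H ** 2)   # tanh derivative
    grad_enc = X_unlabeled.T @ grad_H / len(X_unlabeled)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

loss = np.mean((np.tanh(X_unlabeled @ W_enc) @ W_dec - X_unlabeled) ** 2)
# The pre-trained encoder W_enc would now initialize the encoder of a
# supervised model, which is fine-tuned on a (usually smaller) labeled set.
print(f"baseline MSE: {baseline:.4f}, reconstruction MSE after pre-training: {loss:.4f}")
```

The reconstruction MSE falls below the all-zeros baseline, showing the encoder has learned structure from unlabeled data alone; in practice the auxiliary task is richer (masked prediction, contrastive learning, etc.), but the pre-train-then-fine-tune pattern is the same.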

Papers

Showing 221–230 of 265 papers

Title | Status | Hype
Maximal Multiverse Learning for Promoting Cross-Task Generalization of Fine-Tuned Language Models | | 0
Measles Rash Identification Using Residual Deep Convolutional Neural Network | | 0
MLIP: Enhancing Medical Visual Representation with Divergence Encoder and Knowledge-guided Contrastive Learning | | 0
Multi-Modal Unsupervised Pre-Training for Surgical Operating Room Workflow Analysis | | 0
Multi-Stage Multi-Modal Pre-Training for Automatic Speech Recognition | | 0
On the Generalization Ability of Unsupervised Pretraining | | 0
Unsupervised Deep Representation Learning and Few-Shot Classification of PolSAR Images | | 0
Point Cloud Unsupervised Pre-training via 3D Gaussian Splatting | | 0
Post-training for Deep Learning | | 0
Pre-Training and Fine-Tuning Generative Flow Networks | | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | | Unverified
4 | CNN | Accuracy (%) | 73 | | Unverified
5 | RMDL | Accuracy (%) | 0.1 | | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified
2 | | Sensitivity | 89.1 | | Unverified
3 | RMDL (3 RDLs) | Sensitivity | 0.87 | | Unverified