SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
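The idea in the definition above can be sketched minimally: train an encoder on unlabeled data with a self-supervised reconstruction objective, then reuse its weights to initialize a supervised model. This is an illustrative NumPy sketch (a tiny linear autoencoder with synthetic data), not the method of any specific paper listed below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: 200 samples, 8 features (synthetic stand-in).
X = rng.normal(size=(200, 8))

# Tiny linear autoencoder: encoder W_e (8 -> 3), decoder W_d (3 -> 8).
W_e = rng.normal(scale=0.1, size=(8, 3))
W_d = rng.normal(scale=0.1, size=(3, 8))
lr = 0.01

def recon_loss(X, W_e, W_d):
    """Mean squared reconstruction error of the autoencoder."""
    return float(np.mean((X @ W_e @ W_d - X) ** 2))

loss_before = recon_loss(X, W_e, W_d)
for _ in range(500):
    Z = X @ W_e        # encode
    X_hat = Z @ W_d    # decode
    err = X_hat - X    # reconstruction error
    # Gradients of the mean squared reconstruction loss.
    grad_Wd = Z.T @ err * (2 / X.shape[0])
    grad_We = X.T @ (err @ W_d.T) * (2 / X.shape[0])
    W_d -= lr * grad_Wd
    W_e -= lr * grad_We
loss_after = recon_loss(X, W_e, W_d)

# After pre-training, W_e would initialize the encoder of a supervised
# model, which is then fine-tuned on the (smaller) labeled dataset.
```

The auxiliary task here is plain reconstruction; the papers below use variants of the same recipe (masked autoencoding, seq2seq reconstruction, patch re-identification) with the same pre-train-then-fine-tune structure.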

Papers

Showing 131–140 of 265 papers

| Title | Status | Hype |
|---|---|---|
| Deeply Unsupervised Patch Re-Identification for Pre-training Object Detectors | | 0 |
| Unsupervised Pre-Training for Vietnamese Automatic Speech Recognition in the HYKIST Project | | 0 |
| Unsupervised pre-training helps to conserve views from input distribution | | 0 |
| Unsupervised Pre-Training Using Masked Autoencoders for ECG Analysis | | 0 |
| Unsupervised Pre-training With Seq2Seq Reconstruction Loss for Deep Relation Extraction Models | | 0 |
| Unsupervised Pre-training with Structured Knowledge for Improving Natural Language Inference | | 0 |
| VietMed: A Dataset and Benchmark for Automatic Speech Recognition of Vietnamese in the Medical Domain | | 0 |
| Weakly Supervised Construction of ASR Systems with Massive Video Data | | 0 |
| What is the Best Feature Learning Procedure in Hierarchical Recognition Architectures? | | 0 |
Page 14 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |