SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
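As a minimal sketch of the idea (all names and numbers here are illustrative, not from any listed paper): a tiny linear autoencoder with tied weights is trained on unlabeled 2-D points to minimize squared reconstruction error; the learned encoder weights could then initialize a downstream supervised model.

```python
import random

def pretrain_autoencoder(data, lr=0.05, epochs=200):
    """Unsupervised pre-training sketch: learn encoder weights w by
    minimizing 0.5 * ||w * (w . x) - x||^2 (tied encoder/decoder)."""
    w = [random.uniform(-0.1, 0.1) for _ in range(2)]
    for _ in range(epochs):
        for x in data:
            h = w[0] * x[0] + w[1] * x[1]        # encode: hidden scalar
            recon = [w[0] * h, w[1] * h]         # decode with tied weights
            err = [recon[i] - x[i] for i in range(2)]
            # gradient of the loss w.r.t. w_j (tied-weight chain rule):
            # err_j * h  (decoder path)  +  (err . w) * x_j  (encoder path)
            dot = err[0] * w[0] + err[1] * w[1]
            for j in range(2):
                w[j] -= lr * (err[j] * h + dot * x[j])
    return w

# Unlabeled data lying along the direction (1, 1); the pre-trained
# encoder should recover (approximately) that principal direction.
random.seed(0)
unlabeled = [(t, t) for t in [i / 10 for i in range(-10, 11)]]
w = pretrain_autoencoder(unlabeled)
```

After pre-training, `w` is close to the unit vector along (1, 1), i.e. roughly (0.71, 0.71); in a real pipeline these weights would seed a supervised fine-tuning stage on the labeled task.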

Papers

Showing 211–220 of 265 papers

Title | Status | Hype
Unsupervised Pre-training With Seq2Seq Reconstruction Loss for Deep Relation Extraction Models | - | 0
Unsupervised Pre-training with Structured Knowledge for Improving Natural Language Inference | - | 0
VietMed: A Dataset and Benchmark for Automatic Speech Recognition of Vietnamese in the Medical Domain | - | 0
Weakly Supervised Construction of ASR Systems with Massive Video Data | - | 0
What is the Best Feature Learning Procedure in Hierarchical Recognition Architectures? | - | 0
What Makes for Good Views for Contrastive Learning? | - | 0
Range-aware Positional Encoding via High-order Pretraining: Theory and Practice | - | 0
Recognizing UMLS Semantic Types with Deep Learning | - | 0
Representation Learning for Weakly Supervised Relation Extraction | - | 0
Pre-train and Learn: Preserve Global Information for Graph Neural Networks | Code | 0
Page 22 of 27

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | - | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | - | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | - | Unverified
4 | CNN | Accuracy (%) | 73 | - | Unverified
5 | RMDL | Accuracy (%) | 0.1 | - | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | - | Unverified
2 | - | Sensitivity | 89.1 | - | Unverified
3 | RMDL 3 RDLs | Sensitivity | 0.87 | - | Unverified