SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
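The idea above can be sketched in a minimal, hypothetical numpy example: a linear autoencoder is trained on unlabeled data as the self-supervised auxiliary task (reconstruction), and the learned encoder weights can then initialize a downstream supervised model. The data, sizes, and learning rate are illustrative assumptions, not taken from any listed paper.

```python
import numpy as np

# Toy unsupervised pre-training sketch (assumed setup, not from the papers
# listed here): fit a linear autoencoder on unlabeled data by minimizing
# mean squared reconstruction error, then reuse the encoder weights.

rng = np.random.default_rng(0)

# Unlabeled data: 256 samples of 16 correlated features (synthetic).
latent = rng.normal(size=(256, 4))
mixing = rng.normal(size=(4, 16))
X = latent @ mixing + 0.01 * rng.normal(size=(256, 16))

# Autoencoder parameters: encode 16 -> 4, decode 4 -> 16.
W_enc = rng.normal(scale=0.1, size=(16, 4))
W_dec = rng.normal(scale=0.1, size=(4, 16))

def recon_loss(X, W_enc, W_dec):
    recon = X @ W_enc @ W_dec
    return float(np.mean((recon - X) ** 2))

initial_loss = recon_loss(X, W_enc, W_dec)

lr = 0.01
for _ in range(200):
    H = X @ W_enc               # latent codes
    err = H @ W_dec - X         # reconstruction residual
    # Gradients of the mean squared reconstruction error.
    grad_dec = H.T @ err / len(X)
    grad_enc = X.T @ (err @ W_dec.T) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

final_loss = recon_loss(X, W_enc, W_dec)
# W_enc now holds pre-trained encoder weights that could initialize
# the first layer of a supervised model on labeled data.
```

In practice the auxiliary task is usually richer (masked prediction, contrastive objectives, etc.), but the pattern is the same: optimize a label-free objective first, then transfer the learned representation.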

Papers

Showing 141–150 of 265 papers

Title | Status | Hype
Greedy-layer Pruning: Speeding up Transformer Models for Natural Language Processing | Code | 0
Initialization and Regularization of Factorized Neural Layers | Code | 1
Audio Transformers | | 0
SYNFIX: Automatically Fixing Syntax Errors using Compiler Diagnostics | | 0
A Large-Scale Study on Unsupervised Spatiotemporal Representation Learning | Code | 0
Representation Learning for Weakly Supervised Relation Extraction | | 0
Patient Contrastive Learning: a Performant, Expressive, and Practical Approach to ECG Modeling | Code | 1
On Architectures and Training for Raw Waveform Feature Extraction in ASR | | 0
Maximal Multiverse Learning for Promoting Cross-Task Generalization of Fine-Tuned Language Models | | 0
Seasonal Contrast: Unsupervised Pre-Training from Uncurated Remote Sensing Data | Code | 1
Page 15 of 27

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | | Unverified
4 | CNN | Accuracy (%) | 73 | | Unverified
5 | RMDL | Accuracy (%) | 0.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified
2 | | Sensitivity | 89.1 | | Unverified
3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified