SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
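The idea above can be illustrated with a minimal, self-contained sketch (synthetic data and names chosen here for illustration, not taken from any listed paper): a linear autoencoder is first trained on unlabeled data with a reconstruction loss, and its frozen encoder then initializes the feature extractor for a small supervised head.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Unsupervised pre-training: linear autoencoder on unlabeled data ---
X = rng.normal(size=(500, 20))                # unlabeled examples (synthetic)
W_enc = rng.normal(scale=0.1, size=(20, 8))   # encoder: 20 -> 8
W_dec = rng.normal(scale=0.1, size=(8, 20))   # decoder: 8 -> 20

def recon_loss(X, W_enc, W_dec):
    # mean squared reconstruction error, the auxiliary (self-supervised) task
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

loss_before = recon_loss(X, W_enc, W_dec)

lr = 1e-2
for _ in range(300):
    H = X @ W_enc                             # latent codes
    err = H @ W_dec - X                       # reconstruction error
    g_dec = H.T @ err / len(X)                # gradient of MSE w.r.t. decoder
    g_enc = X.T @ (err @ W_dec.T) / len(X)    # gradient of MSE w.r.t. encoder
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc

loss_after = recon_loss(X, W_enc, W_dec)      # should be below loss_before

# --- Supervised fine-tuning: logistic head on the frozen encoder ---
X_lab = rng.normal(size=(200, 20))            # labeled examples (synthetic)
y = (X_lab[:, 0] > 0).astype(float)           # toy binary labels
w = np.zeros(8)                               # classifier head on latent codes
for _ in range(500):
    H = X_lab @ W_enc                         # pre-trained encoder, kept frozen
    p = 1.0 / (1.0 + np.exp(-(H @ w)))        # logistic prediction
    w -= 0.2 * H.T @ (p - y) / len(y)         # logistic-regression gradient step

p = 1.0 / (1.0 + np.exp(-(X_lab @ W_enc @ w)))
acc = float(np.mean((p > 0.5) == (y > 0.5)))  # above chance on the toy task
```

Many of the papers listed below follow this same two-phase pattern, differing mainly in the auxiliary task (reconstruction, masking, temporal prediction) and in whether the encoder is frozen or further fine-tuned.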

Papers

Showing 226–250 of 265 papers

Title | Status | Hype
Neural Grammatical Error Correction Systems with Unsupervised Pre-training on Synthetic Data | Code | 0
Exploiting Unsupervised Pre-training and Automated Feature Engineering for Low-resource Hate Speech Detection in Polish | - | 0
Improving Relation Extraction by Pre-trained Language Representations | Code | 0
Color Constancy Convolutional Autoencoder | - | 0
UHH-LT at SemEval-2019 Task 6: Supervised vs. Unsupervised Transfer Learning for Offensive Language Detection | - | 0
Unsupervised pre-training helps to conserve views from input distribution | - | 0
Unsupervised Pre-Training of Image Features on Non-Curated Data | Code | 0
BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning | Code | 0
Tuning Multilingual Transformers for Named Entity Recognition on Slavic Languages | Code | 0
Unsupervised Transfer Learning for Spoken Language Understanding in Intelligent Agents | Code | 0
Leveraging Random Label Memorization for Unsupervised Pre-Training | - | 0
Advancing PICO Element Detection in Biomedical Text via Deep Neural Networks | Code | 0
Temporal Interpolation as an Unsupervised Pretraining Task for Optical Flow Estimation | - | 0
Deep Belief Networks Based Feature Generation and Regression for Predicting Wind Power | - | 0
Deep Discriminative Model for Video Classification | - | 0
RMDL: Random Multimodel Deep Learning for Classification | Code | 0
DiscrimNet: Semi-Supervised Action Recognition from Videos using Generative Adversarial Networks | - | 0
Post-training for Deep Learning | - | 0
Co-Morbidity Exploration on Wearables Activity Data Using Unsupervised Pre-training and Multi-Task Learning | - | 0
Enhance Visual Recognition under Adverse Conditions via Deep Networks | - | 0
Self-Supervised Relative Depth Learning for Urban Scene Understanding | - | 0
A Pitfall of Unsupervised Pre-Training | - | 0
Discovery of Visual Semantics by Unsupervised and Self-Supervised Representation Learning | - | 0
A Pitfall of Unsupervised Pre-Training | - | 0
Unsupervised Pre-training With Seq2Seq Reconstruction Loss for Deep Relation Extraction Models | - | 0
Page 10 of 11

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | - | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | - | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | - | Unverified
4 | CNN | Accuracy (%) | 73 | - | Unverified
5 | RMDL | Accuracy (%) | 0.1 | - | Unverified

# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | - | Unverified
2 | - | Sensitivity | 89.1 | - | Unverified
3 | RMDL 3 RDLs | Sensitivity | 0.87 | - | Unverified