SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
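The idea can be illustrated with a minimal, self-contained sketch (not taken from any listed paper; all names and sizes here are hypothetical): a one-weight linear autoencoder is trained on unlabeled data with a reconstruction pretext task, after which its encoder weight could be reused as a pre-trained feature extractor for a downstream supervised task.

```python
import random

# Minimal sketch of unsupervised pre-training via a reconstruction
# pretext task. A one-weight linear "autoencoder" learns enc/dec such
# that dec * enc * x reconstructs x on unlabeled scalar inputs.
# Purely illustrative; real systems use deep networks and richer
# pretext tasks (masking, contrastive objectives, etc.).

random.seed(0)

def pretrain_autoencoder(xs, lr=0.01, epochs=200):
    """SGD on the reconstruction loss 0.5 * (dec*enc*x - x)^2."""
    enc, dec = 0.1, 0.1
    for _ in range(epochs):
        for x in xs:
            h = enc * x        # encode input to a latent "feature"
            r = dec * h        # decode latent back to input space
            err = r - x        # reconstruction error
            dec -= lr * err * h        # gradient w.r.t. dec
            enc -= lr * err * dec * x  # gradient w.r.t. enc
    return enc, dec

# Unlabeled data: no targets are needed for the pretext task.
unlabeled = [random.uniform(-1, 1) for _ in range(100)]
enc, dec = pretrain_autoencoder(unlabeled)

# At convergence dec * enc approaches 1 (near-perfect reconstruction);
# enc would then be frozen and reused for a supervised downstream task.
print(round(dec * enc, 2))
```

The pre-trained `enc` plays the role of the transferred backbone: a downstream classifier would be fit on `enc * x` features using only a small labeled set.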

Papers

Showing 251–260 of 265 papers

Title | Status | Hype
Targeted Forgetting of Image Subgroups in CLIP Models | | 0
Temporal Autoencoding Improves Generative Models of Time Series | | 0
Temporal Interpolation as an Unsupervised Pretraining Task for Optical Flow Estimation | | 0
The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification | | 0
Towards General Text Embeddings with Multi-stage Contrastive Learning | | 0
TraIL-Det: Transformation-Invariant Local Feature Networks for 3D LiDAR Object Detection with Unsupervised Pre-Training | | 0
Transfer Learning for Context-Aware Spoken Language Understanding | | 0
Transferring self-supervised pre-trained models for SHM data anomaly detection with scarce labeled data | | 0
Triplet Contrastive Learning for Brain Tumor Classification | | 0
UHH-LT at SemEval-2019 Task 6: Supervised vs. Unsupervised Transfer Learning for Offensive Language Detection | | 0
Page 26 of 27

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | | Unverified
4 | CNN | Accuracy (%) | 73 | | Unverified
5 | RMDL | Accuracy (%) | 0.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified
2 | | Sensitivity | 89.1 | | Unverified
3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified