SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
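The idea above can be sketched in a few lines. This is a hypothetical toy illustration, not a method from any paper listed below: a linear autoencoder is fit in closed form via SVD on unlabeled data (its optimum coincides with PCA), and the resulting encoder weights would then initialize a downstream supervised model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: 200 samples in 8 dims, lying in a 3-dim subspace.
basis = rng.normal(size=(3, 8))
X_unlabeled = rng.normal(size=(200, 3)) @ basis

def pretrain_pca_encoder(X, hidden):
    """Fit a linear autoencoder in closed form via SVD (its optimum is PCA)."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:hidden].T          # encoder weights, shape (d, hidden)
    return W, mu

W, mu = pretrain_pca_encoder(X_unlabeled, hidden=3)

# Encode, then decode with the transpose (valid for an orthonormal encoder).
Z = (X_unlabeled - mu) @ W
X_hat = Z @ W.T + mu
recon_mse = float(np.mean((X_hat - X_unlabeled) ** 2))
print(f"reconstruction MSE: {recon_mse:.6f}")  # near zero: the 3-dim code captures the data

# Downstream, W would initialize the first layer of a supervised network
# and be fine-tuned on the (smaller) labeled set.
```

In practice the auxiliary task is usually nonlinear (masked prediction, contrastive learning, etc.), but the pattern is the same: learn weights from unlabeled data, then transfer them to the supervised task.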

Papers

Showing 181–190 of 265 papers

| Title | Status | Hype |
| --- | --- | --- |
| Spiral Contrastive Learning: An Efficient 3D Representation Learning Method for Unannotated CT Lesions | | 0 |
| SYNFIX: Automatically Fixing Syntax Errors using Compiler Diagnostics | | 0 |
| Synthetic vascular structure generation for unsupervised pre-training in CTA segmentation tasks | | 0 |
| Targeted Forgetting of Image Subgroups in CLIP Models | | 0 |
| Temporal Autoencoding Improves Generative Models of Time Series | | 0 |
| Temporal Interpolation as an Unsupervised Pretraining Task for Optical Flow Estimation | | 0 |
| The Efficiency of Pre-training with Objective Masking in Pseudo Labeling for Semi-Supervised Text Classification | | 0 |
| Towards General Text Embeddings with Multi-stage Contrastive Learning | | 0 |
| TraIL-Det: Transformation-Invariant Local Feature Networks for 3D LiDAR Object Detection with Unsupervised Pre-Training | | 0 |
| Transfer Learning for Context-Aware Spoken Language Understanding | | 0 |
Page 19 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |