SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
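The idea can be sketched in a few lines: optimize a self-supervised objective (here, reconstruction) on unlabeled data, then reuse the learned encoder for a downstream task. This is a minimal illustrative sketch, not the method of any paper listed below; the tiny linear autoencoder, synthetic data, and all variable names are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled data: 256 samples, 8 features (a stand-in for a real corpus).
X = rng.normal(size=(256, 8))

# Tiny linear autoencoder: encoder W_enc (8 -> 4), decoder W_dec (4 -> 8).
W_enc = rng.normal(scale=0.1, size=(8, 4))
W_dec = rng.normal(scale=0.1, size=(4, 8))

def recon_loss(X, W_enc, W_dec):
    """Mean squared reconstruction error: the self-supervised auxiliary objective."""
    return float(np.mean((X @ W_enc @ W_dec - X) ** 2))

lr = 0.05
loss_before = recon_loss(X, W_enc, W_dec)
for _ in range(500):
    H = X @ W_enc                        # latent codes
    R = H @ W_dec                        # reconstruction
    G = 2.0 * (R - X) / X.size           # dLoss/dR
    g_dec = H.T @ G                      # dLoss/dW_dec
    g_enc = X.T @ (G @ W_dec.T)          # dLoss/dW_enc
    W_dec -= lr * g_dec
    W_enc -= lr * g_enc
loss_after = recon_loss(X, W_enc, W_dec)

# Pre-training done: W_enc now maps inputs to a 4-d representation on which
# a small supervised head can be trained or fine-tuned with labeled data.
```

In practice the encoder is a deep network and the auxiliary task is something like masked reconstruction (as in SatMAE) or instance discrimination, but the pattern is the same: pre-train on unlabeled data, then fine-tune on the labeled target task.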

Papers

Showing 101–110 of 265 papers

| Title | Status | Hype |
|---|---|---|
| Unsupervised pre-training of graph transformers on patient population graphs | Code | 1 |
| SatMAE: Pre-training Transformers for Temporal and Multi-Spectral Satellite Imagery | Code | 2 |
| Multi-Modal Unsupervised Pre-Training for Surgical Operating Room Workflow Analysis | — | 0 |
| Unsupervised Instance Discriminative Learning for Depression Detection from Speech Signals | — | 0 |
| Contextual embedding and model weighting by fusing domain knowledge on Biomedical Question Answering | Code | 0 |
| CARLANE: A Lane Detection Benchmark for Unsupervised Domain Adaptation from Simulation to multiple Real-World Domains | Code | 1 |
| Curriculum-Based Self-Training Makes Better Few-Shot Learners for Data-to-Text Generation | Code | 0 |
| Self-Supervised Visual Representation Learning with Semantic Grouping | Code | 1 |
| Deep Features for CBIR with Scarce Data using Hebbian Learning | — | 0 |
| Semi-supervised 3D shape segmentation with multilevel consistency and part substitution | Code | 1 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 15 RDLs | Accuracy (%) | 95 | — | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | — | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | — | Unverified |
| 4 | CNN | Accuracy (%) | 73 | — | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | — | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | — | Unverified |
| 2 | — | Sensitivity | 89.1 | — | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | — | Unverified |