SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
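As a concrete illustration of the idea (not taken from any of the papers listed below), the following sketch pre-trains a one-hidden-layer autoencoder on unlabeled data using a reconstruction pretext task; the learned encoder weights could then initialize a downstream supervised model. All names and sizes here are illustrative assumptions.

```python
# Minimal sketch of unsupervised (self-supervised) pre-training:
# train an autoencoder to reconstruct unlabeled inputs, so the
# encoder learns reusable features without any labels.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 16))                 # "unlabeled" data
W_enc = rng.normal(scale=0.1, size=(16, 8))    # encoder weights
W_dec = rng.normal(scale=0.1, size=(8, 16))    # decoder weights
lr = 0.01

def recon_loss(X, W_enc, W_dec):
    H = np.tanh(X @ W_enc)     # encode
    X_hat = H @ W_dec          # decode (reconstruction pretext task)
    return np.mean((X_hat - X) ** 2)

loss_before = recon_loss(X, W_enc, W_dec)
for _ in range(200):
    H = np.tanh(X @ W_enc)
    X_hat = H @ W_dec
    err = (X_hat - X) / X.shape[0]             # d(MSE)/d(X_hat), up to a constant
    grad_dec = H.T @ err                       # gradient w.r.t. decoder
    grad_h = err @ W_dec.T * (1 - H ** 2)      # backprop through tanh
    grad_enc = X.T @ grad_h                    # gradient w.r.t. encoder
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc
loss_after = recon_loss(X, W_enc, W_dec)

print(loss_after < loss_before)  # pre-training reduced the pretext loss
```

After this stage, `W_enc` would typically be kept and fine-tuned on a small labeled set, which is the "label-efficient" scenario several of the papers below target.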

Papers

Showing 76–100 of 265 papers

| Title | Status | Hype |
|---|---|---|
| Pre-training Contextualized World Models with In-the-wild Videos for Reinforcement Learning | Code | 1 |
| Just Ask for Calibration: Strategies for Eliciting Calibrated Confidence Scores from Language Models Fine-Tuned with Human Feedback | Code | 0 |
| Rethinking Semi-supervised Learning with Language Models | Code | 1 |
| PTGB: Pre-Train Graph Neural Networks for Brain Network Analysis | Code | 1 |
| LATTE: Label-efficient Incident Phenotyping from Longitudinal Electronic Health Records | Code | 0 |
| FreePoint: Unsupervised Point Cloud Instance Segmentation | Code | 1 |
| Don't Stop Pretraining? Make Prompt-based Fine-tuning Powerful Learner | Code | 1 |
| PUNR: Pre-training with User Behavior Modeling for News Recommendation | Code | 0 |
| Incorporating Unlabelled Data into Bayesian Neural Networks | | 0 |
| Unsupervised Pre-Training For Data-Efficient Text-to-Speech On Low Resource Languages | Code | 1 |
| MultiTalent: A Multi-Dataset Approach to Medical Image Segmentation | Code | 1 |
| Generalized 3D Self-supervised Learning Framework via Prompted Foreground-Aware Feature Contrast | Code | 0 |
| Research on CPI Prediction Based on Natural Language Processing | | 0 |
| Device Tuning for Multi-Task Large Model | | 0 |
| A Benchmark of Nested Named Entity Recognition Approaches in Historical Structured Documents | | 0 |
| DocILE Benchmark for Document Information Localization and Extraction | Code | 1 |
| Boosting Low-Data Instance Segmentation by Unsupervised Pre-training with Saliency Prompt | | 0 |
| Self-FuseNet: Data Free Unsupervised Remote Sensing Image Super-Resolution | | 0 |
| Unsupervised Pre-Training for Vietnamese Automatic Speech Recognition in the HYKIST Project | | 0 |
| Wukong-Reader: Multi-modal Pre-training for Fine-grained Visual Document Understanding | Code | 0 |
| ACROBAT -- a multi-stain breast cancer histological whole-slide-image data set from routine diagnostics for computational pathology | | 0 |
| EUCLID: Towards Efficient Unsupervised Reinforcement Learning with Multi-choice Dynamics Model | | 0 |
| Spiral Contrastive Learning: An Efficient 3D Representation Learning Method for Unannotated CT Lesions | | 0 |
| Evaluate Confidence Instead of Perplexity for Zero-shot Commonsense Reasoning | | 0 |
| ProposalContrast: Unsupervised Pre-training for LiDAR-based 3D Object Detection | Code | 1 |
Page 4 of 11

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
|---|---|---|---|---|---|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |