SOTAVerified

Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
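To make the idea concrete, here is a minimal sketch of unsupervised pre-training, assuming a toy setup: a tiny linear autoencoder is trained on unlabeled data with reconstruction as the auxiliary task, and the learned encoder weights are then available to initialize a downstream (supervised) model. All shapes, hyperparameters, and variable names are illustrative, not from any paper listed here.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 16))   # unlabeled data: 256 samples, 16 features

d_in, d_hidden = 16, 4
# Small random initialization for encoder/decoder weights (assumed sizes).
W_enc = rng.normal(scale=0.1, size=(d_in, d_hidden))
W_dec = rng.normal(scale=0.1, size=(d_hidden, d_in))
lr = 0.5

def recon_loss(X, W_enc, W_dec):
    """Mean squared reconstruction error: the unsupervised auxiliary objective."""
    X_hat = (X @ W_enc) @ W_dec
    return float(np.mean((X - X_hat) ** 2))

loss_before = recon_loss(X, W_enc, W_dec)

# Plain gradient descent on the reconstruction loss (no labels used anywhere).
for _ in range(200):
    H = X @ W_enc                      # encode: (256, 4)
    X_hat = H @ W_dec                  # decode: (256, 16)
    err = 2.0 * (X_hat - X) / X.size   # dL/dX_hat for the element-wise MSE
    grad_dec = H.T @ err               # dL/dW_dec
    grad_enc = X.T @ (err @ W_dec.T)   # dL/dW_enc (chain rule through decoder)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

loss_after = recon_loss(X, W_enc, W_dec)

# The pre-trained encoder would now seed a supervised model, e.g.:
# downstream_init = W_enc.copy()  # fine-tune on the (smaller) labeled set
```

The point of the sketch is the workflow, not the model: the auxiliary task (here, reconstruction; in the papers below, masking, contrastive objectives, etc.) consumes only unlabeled data, and only its learned representation is carried into fine-tuning.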

Papers

Showing 61–70 of 265 papers

| Title | Status | Hype |
| --- | --- | --- |
| CUPre: Cross-domain Unsupervised Pre-training for Few-Shot Cell Segmentation | | 0 |
| Classifying Whole Slide Images: What Matters? | | 0 |
| Pre-Training and Fine-Tuning Generative Flow Networks | | 0 |
| Differentially Private Optimization for Non-Decomposable Objective Functions | | 0 |
| A Brief History of Prompt: Leveraging Language Models. (Through Advanced Prompting) | | 0 |
| Unsupervised Pre-Training for Vietnamese Automatic Speech Recognition in the HYKIST Project | | 0 |
| Examining the Effect of Pre-training on Time Series Classification | | 0 |
| Self-Supervised Pre-Training Boosts Semantic Scene Segmentation on LiDAR Data | Code | 0 |
| Enhancing the vocal range of single-speaker singing voice synthesis with melody-unsupervised pre-training | | 0 |
| Masked Feature Modelling: Feature Masking for the Unsupervised Pre-training of a Graph Attention Network Block for Bottom-up Video Event Recognition | | 0 |
Page 7 of 27

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | 15 RDLs | Accuracy (%) | 95 | | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | | Unverified |
| 4 | CNN | Accuracy (%) | 73 | | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | | Unverified |
| # | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified |
| 2 | | Sensitivity | 89.1 | | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified |