
Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
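To make the definition concrete, below is a minimal sketch of the typical two-stage recipe: train an encoder on unlabeled data with a self-supervised auxiliary task (here, denoising reconstruction), then reuse that encoder as the initialization for a supervised downstream model. It assumes a PyTorch environment; the architecture, noise level, dimensions, and hyperparameters are illustrative placeholders and are not taken from any paper or result listed on this page.

```python
# Sketch of unsupervised pre-training followed by supervised fine-tuning.
# Stage 1 uses only unlabeled data; stage 2 reuses the pre-trained encoder.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, in_dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden), nn.ReLU())
    def forward(self, x):
        return self.net(x)

def pretrain(encoder, unlabeled, epochs=5, noise=0.2, lr=1e-3):
    """Self-supervised auxiliary task: reconstruct the clean input from a
    noise-corrupted copy. No labels are used at this stage."""
    decoder = nn.Linear(256, unlabeled.shape[1])
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=lr)
    for _ in range(epochs):
        corrupted = unlabeled + noise * torch.randn_like(unlabeled)
        loss = nn.functional.mse_loss(decoder(encoder(corrupted)), unlabeled)
        opt.zero_grad(); loss.backward(); opt.step()
    return encoder

if __name__ == "__main__":
    unlabeled = torch.rand(1024, 784)          # large unlabeled corpus (illustrative)
    labeled_x = torch.rand(64, 784)            # small labeled set for the target task
    labeled_y = torch.randint(0, 10, (64,))
    encoder = pretrain(Encoder(), unlabeled)   # stage 1: unsupervised pre-training
    classifier = nn.Sequential(encoder, nn.Linear(256, 10))  # stage 2: supervised fine-tuning
    opt = torch.optim.Adam(classifier.parameters(), lr=1e-4)
    for _ in range(5):
        loss = nn.functional.cross_entropy(classifier(labeled_x), labeled_y)
        opt.zero_grad(); loss.backward(); opt.step()
```

The auxiliary task here is denoising reconstruction for brevity; the papers listed below use a variety of other self-supervised objectives (masking, contrastive learning, successor features, etc.), but the two-stage structure is the same.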

Papers

Showing 31–40 of 265 papers (page 4 of 27)

Title | Status | Hype
Learning Unsupervised Gaze Representation via Eye Mask Driven Information Bottleneck | - | 0
ConStyle v2: A Strong Prompter for All-in-One Image Restoration | Code | 1
ExPLoRA: Parameter-Efficient Extended Pre-Training to Adapt Vision Transformers under Domain Shifts | - | 0
An Investigation of Noise Robustness for Flow-Matching-Based Zero-Shot TTS | - | 0
Large Brain Model for Learning Generic Representations with Tremendous EEG Data in BCI | Code | 4
PEAC: Unsupervised Pre-training for Cross-Embodiment Reinforcement Learning | Code | 1
Unsupervised Pre-training with Language-Vision Prompts for Low-Data Instance Segmentation | Code | 0
Self-Supervised Modality-Agnostic Pre-Training of Swin Transformers | Code | 0
Dynamic data sampler for cross-language transfer learning in large language models | Code | 7
Decoupling Exploration and Exploitation for Unsupervised Pre-training with Successor Features | Code | 0

Benchmark Results

# | Model | Metric | Claimed | Verified | Status
1 | 15 RDLs | Accuracy (%) | 95 | | Unverified
2 | 9 RDLs | Accuracy (%) | 94 | | Unverified
3 | 3 RMDL | Accuracy (%) | 93 | | Unverified
4 | CNN | Accuracy (%) | 73 | | Unverified
5 | RMDL | Accuracy (%) | 0.1 | | Unverified
# | Model | Metric | Claimed | Verified | Status
1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | | Unverified
2 | | Sensitivity | 89.1 | | Unverified
3 | RMDL 3 RDLs | Sensitivity | 0.87 | | Unverified