Unsupervised Pre-training

Pre-training a neural network using unsupervised (self-supervised) auxiliary tasks on unlabeled data.
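As a concrete illustration of the idea, here is a minimal sketch of one common flavor of unsupervised pre-training: SimCLR-style contrastive learning with an NT-Xent loss. Everything in it (the toy encoder, the noise "augmentation", the dimensions and hyperparameters) is an illustrative assumption, not the method of any paper listed below.

```python
# Minimal sketch of unsupervised (self-supervised) pre-training via
# SimCLR-style contrastive learning. All names, shapes, and hyperparameters
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Encoder(nn.Module):
    """Toy MLP encoder standing in for a real backbone (e.g. a ResNet)."""
    def __init__(self, in_dim=128, emb_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(),
            nn.Linear(256, emb_dim),
        )

    def forward(self, x):
        return self.net(x)

def nt_xent_loss(z1, z2, temperature=0.5):
    """Pull two augmented views of the same sample together and push
    apart all other samples in the batch (NT-Xent)."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2N, D)
    sim = z @ z.t() / temperature                       # cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # mask self-similarity
    n = z1.size(0)
    # The positive for row i is row i+n, and vice versa.
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)])
    return F.cross_entropy(sim, targets)

def augment(x):
    # Placeholder "augmentation": additive noise. Real pipelines use
    # domain-specific transforms (crops, masking, color jitter, ...).
    return x + 0.1 * torch.randn_like(x)

encoder = Encoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)

# Unlabeled data: note that no labels appear anywhere in this loop.
data = torch.randn(512, 128)

for step in range(100):
    batch = data[torch.randint(0, data.size(0), (64,))]
    z1, z2 = encoder(augment(batch)), encoder(augment(batch))
    loss = nt_xent_loss(z1, z2)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

After pre-training, the encoder weights are transferred to a downstream task and fine-tuned, typically on much scarcer labeled data.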

Papers

Showing 91–100 of 265 papers

| Title | Status | Hype |
|-------|--------|------|
| HindiLLM: Large Language Model for Hindi | — | 0 |
| Transferring self-supervised pre-trained models for SHM data anomaly detection with scarce labeled data | — | 0 |
| CLAP: Unsupervised 3D Representation Learning for Fusion 3D Perception via Curvature Sampling and Prototype Learning | — | 0 |
| Point Cloud Unsupervised Pre-training via 3D Gaussian Splatting | — | 0 |
| Take Package as Language: Anomaly Detection Using Transformer | Code | 0 |
| SynCo: Synthetic Hard Negatives in Contrastive Learning for Better Unsupervised Visual Representations | Code | 0 |
| Calibrating Language Models with Adaptive Temperature Scaling | Code | 0 |
| Range-aware Positional Encoding via High-order Pretraining: Theory and Practice | — | 0 |
| GFlowNet Pretraining with Inexpensive Rewards | — | 0 |
| TraIL-Det: Transformation-Invariant Local Feature Networks for 3D LiDAR Object Detection with Unsupervised Pre-Training | — | 0 |

Benchmark Results

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | 15 RDLs | Accuracy (%) | 95 | — | Unverified |
| 2 | 9 RDLs | Accuracy (%) | 94 | — | Unverified |
| 3 | 3 RMDL | Accuracy (%) | 93 | — | Unverified |
| 4 | CNN | Accuracy (%) | 73 | — | Unverified |
| 5 | RMDL | Accuracy (%) | 0.1 | — | Unverified |

| # | Model | Metric | Claimed | Verified | Status |
|---|-------|--------|---------|----------|--------|
| 1 | RMDL (30 RDLs) | Sensitivity (VEB) | 90.69 | — | Unverified |
| 2 | — | Sensitivity | 89.1 | — | Unverified |
| 3 | RMDL 3 RDLs | Sensitivity | 0.87 | — | Unverified |