SOTAVerified

Learning with noisy labels

Learning with noisy labels refers to the setting where the training labels have been corrupted: an adversary (or a noisy annotation process) has perturbed labels that would otherwise have come from a "clean" distribution. This setting can also be used to cast learning from only positive and unlabeled data.
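As a concrete illustration of the setting, here is a minimal sketch of symmetric (uniform) label-flipping, a standard way corrupted datasets are simulated in this literature; the function name and parameters are illustrative, not from any specific paper:

```python
import numpy as np

def flip_labels(y, noise_rate, num_classes, seed=None):
    """Symmetric label noise: each label is replaced, with probability
    `noise_rate`, by a uniformly random *different* class."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y).copy()
    flip = rng.random(len(y)) < noise_rate          # which samples to corrupt
    # offset 1..num_classes-1 guarantees the new label differs from the old one
    offsets = rng.integers(1, num_classes, size=int(flip.sum()))
    y[flip] = (y[flip] + offsets) % num_classes
    return y
```

Methods on this page are typically benchmarked by applying such synthetic noise (symmetric or class-conditional) to clean datasets and measuring test accuracy on clean labels.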

Papers

Showing 26-50 of 249 papers

Title | Status | Hype
L2B: Learning to Bootstrap Robust Models for Combating Label Noise | Code | 1
Learning with Noisy Labels by Efficient Transition Matrix Estimation to Combat Label Miscorrection | Code | 1
Learning with Noisy Labels for Robust Point Cloud Segmentation | Code | 1
CLIPCleaner: Cleaning Noisy Labels with CLIP | Code | 1
Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | Code | 1
Co-Correcting: Noise-tolerant Medical Image Classification via mutual Label Correction | Code | 1
CSOT: Curriculum and Structure-Aware Optimal Transport for Learning with Noisy Labels | Code | 1
AlleNoise: large-scale text classification benchmark dataset with real-world label noise | Code | 1
Co-learning: Learning from Noisy Labels with Self-supervision | Code | 1
Co-Learning Meets Stitch-Up for Noisy Multi-label Visual Recognition | Code | 1
Collaborative Noisy Label Cleaner: Learning Scene-aware Trailers for Multi-modal Highlight Detection in Movies | Code | 1
DAT: Training Deep Networks Robust To Label-Noise by Matching the Feature Distributions | Code | 1
Combating noisy labels by agreement: A joint training method with co-regularization | Code | 1
Dirichlet-based Per-Sample Weighting by Transition Matrix for Noisy Label Learning | Code | 1
From Noisy Prediction to True Label: Noisy Prediction Calibration via Generative Model | Code | 1
Boosting Co-teaching with Compression Regularization for Label Noise | Code | 1
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation | Code | 1
Compressing Features for Learning with Noisy Labels | Code | 1
Mitigating Memorization of Noisy Labels via Regularization between Representations | Code | 1
Improving Generalization by Controlling Label-Noise Information in Neural Network Weights | Code | 1
Improving Medical Image Classification in Noisy Labels Using Only Self-supervised Pretraining | Code | 1
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels | Code | 1
Active Negative Loss: A Robust Framework for Learning with Noisy Labels | Code | 1
Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | Code | 1
Early-Learning Regularization Prevents Memorization of Noisy Labels | Code | 1
Page 2 of 10

No leaderboard results yet.