SOTAVerified

Learning with noisy labels

In learning with noisy labels, the training labels are assumed to be corrupted: an adversary (or a noisy annotation process) has altered labels that would otherwise have come from a "clean" distribution. This setting can also be used to cast learning from only positive and unlabeled data.
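The corruption process described above is often simulated with symmetric label flipping: each clean label is replaced, with some probability, by a different class drawn uniformly at random. A minimal sketch (function and parameter names here are our own, not from any specific paper):

```python
import random

def flip_labels(labels, num_classes, noise_rate, seed=0):
    """Simulate symmetric label noise: with probability `noise_rate`,
    replace a label with a different class chosen uniformly at random.
    Illustrative only; real benchmarks also use asymmetric or
    instance-dependent noise models."""
    rng = random.Random(seed)
    noisy = []
    for y in labels:
        if rng.random() < noise_rate:
            # Pick uniformly among the other classes
            candidates = [c for c in range(num_classes) if c != y]
            noisy.append(rng.choice(candidates))
        else:
            noisy.append(y)
    return noisy

clean = [0, 1, 2] * 100
noisy = flip_labels(clean, num_classes=3, noise_rate=0.4)
changed = sum(c != n for c, n in zip(clean, noisy))
print(f"{changed} of {len(clean)} labels flipped")
```

With a noise rate of 0.4, roughly 40% of the labels end up flipped; methods in the list below are evaluated on how well they recover the clean decision rule despite such corruption.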

Papers

Showing 1–50 of 249 papers

Title | Status | Hype
NLPrompt: Noise-Label Prompt Learning for Vision-Language Models | Code | 2
Sharpness-Aware Minimization for Efficiently Improving Generalization | Code | 2
SURE: SUrvey REcipes for building reliable and robust deep networks | Code | 2
Learning with Noisy Labels via Sparse Regularization | Code | 1
Learning with Instance-Dependent Label Noise: A Sample Sieve Approach | Code | 1
Learning with Noisy labels via Self-supervised Adversarial Noisy Masking | Code | 1
Joint Class-Affinity Loss Correction for Robust Medical Image Segmentation with Noisy Labels | Code | 1
From Noisy Prediction to True Label: Noisy Prediction Calibration via Generative Model | Code | 1
Imprecise Label Learning: A Unified Framework for Learning with Various Imprecise Label Configurations | Code | 1
Jigsaw-ViT: Learning Jigsaw Puzzles in Vision Transformer | Code | 1
L2B: Learning to Bootstrap Robust Models for Combating Label Noise | Code | 1
Learning with Feature-Dependent Label Noise: A Progressive Approach | Code | 1
Learning with Noisy Labels Revisited: A Study Using Real-World Human Annotations | Code | 1
Asymmetric Loss Functions for Learning with Noisy Labels | Code | 1
Dirichlet-Based Prediction Calibration for Learning with Noisy Labels | Code | 1
Augmentation Strategies for Learning with Noisy Labels | Code | 1
Bayesian Optimization Meets Self-Distillation | Code | 1
Few-shot Learning with Noisy Labels | Code | 1
Hard Sample Aware Noise Robust Learning for Histopathology Image Classification | Code | 1
DAT: Training Deep Networks Robust To Label-Noise by Matching the Feature Distributions | Code | 1
Improving Generalization by Controlling Label-Noise Information in Neural Network Weights | Code | 1
Is BERT Robust to Label Noise? A Study on Learning with Noisy Labels in Text Classification | Code | 1
DISC: Learning From Noisy Labels via Dynamic Instance-Specific Selection and Correction | Code | 1
Learn From All: Erasing Attention Consistency for Noisy Label Facial Expression Recognition | Code | 1
DivideMix: Learning with Noisy Labels as Semi-supervised Learning | Code | 1
Early-Learning Regularization Prevents Memorization of Noisy Labels | Code | 1
Learning with Noisy Labels by Efficient Transition Matrix Estimation to Combat Label Miscorrection | Code | 1
Learning with Noisy Labels for Robust Point Cloud Segmentation | Code | 1
CLIPCleaner: Cleaning Noisy Labels with CLIP | Code | 1
Clusterability as an Alternative to Anchor Points When Learning with Noisy Labels | Code | 1
Co-Correcting: Noise-tolerant Medical Image Classification via mutual Label Correction | Code | 1
CSOT: Curriculum and Structure-Aware Optimal Transport for Learning with Noisy Labels | Code | 1
AlleNoise: large-scale text classification benchmark dataset with real-world label noise | Code | 1
Co-learning: Learning from Noisy Labels with Self-supervision | Code | 1
Co-Learning Meets Stitch-Up for Noisy Multi-label Visual Recognition | Code | 1
Collaborative Noisy Label Cleaner: Learning Scene-aware Trailers for Multi-modal Highlight Detection in Movies | Code | 1
FedNoisy: Federated Noisy Label Learning Benchmark | Code | 1
Combating noisy labels by agreement: A joint training method with co-regularization | Code | 1
Dirichlet-based Per-Sample Weighting by Transition Matrix for Noisy Label Learning | Code | 1
Generalized Jensen-Shannon Divergence Loss for Learning with Noisy Labels | Code | 1
Boosting Co-teaching with Compression Regularization for Label Noise | Code | 1
Comparing Kullback-Leibler Divergence and Mean Squared Error Loss in Knowledge Distillation | Code | 1
Compressing Features for Learning with Noisy Labels | Code | 1
Mitigating Memorization of Noisy Labels via Regularization between Representations | Code | 1
Improving Medical Image Classification in Noisy Labels Using Only Self-supervised Pretraining | Code | 1
Instance-Dependent Noisy Label Learning via Graphical Modelling | Code | 1
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels | Code | 1
Active Negative Loss: A Robust Framework for Learning with Noisy Labels | Code | 1
Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels | Code | 1
Faster Meta Update Strategy for Noise-Robust Deep Learning | Code | 1
Page 1 of 5

No leaderboard results yet.