SOTAVerified

Learning with Neighbor Consistency for Noisy Labels

2022-02-04 · CVPR 2022 · Code Available

Ahmet Iscen, Jack Valmadre, Anurag Arnab, Cordelia Schmid


Abstract

Recent advances in deep learning have relied on large, labelled datasets to train high-capacity models. However, collecting large datasets in a time- and cost-efficient manner often results in label noise. We present a method for learning from noisy labels that leverages similarities between training examples in feature space, encouraging the prediction of each example to be similar to its nearest neighbours. Compared to training algorithms that use multiple models or distinct stages, our approach takes the form of a simple, additional regularization term. It can be interpreted as an inductive version of the classical, transductive label propagation algorithm. We thoroughly evaluate our method on datasets exhibiting both synthetic (CIFAR-10, CIFAR-100) and realistic (mini-WebVision, WebVision, Clothing1M, mini-ImageNet-Red) noise, and achieve competitive or state-of-the-art accuracies across all of them.
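The core idea — regularizing each example's prediction towards a similarity-weighted mixture of its nearest neighbours' predictions — can be sketched in a few lines. This is a minimal NumPy illustration of a neighbour-consistency-style regularizer, not the paper's exact formulation: the function name, the choice of cosine similarity, the temperature, and the cross-entropy form of the divergence are all assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def neighbor_consistency_loss(features, logits, k=2, temperature=0.1):
    """Illustrative neighbour-consistency regularizer (not the paper's exact loss).

    features: (B, D) embeddings; logits: (B, C) classifier outputs.
    """
    # Cosine similarity between all pairs of examples in the batch.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = f @ f.T
    np.fill_diagonal(sim, -np.inf)  # an example is not its own neighbour

    # Indices and similarities of the k nearest neighbours per example.
    idx = np.argsort(-sim, axis=1)[:, :k]           # (B, k)
    nn_sim = np.take_along_axis(sim, idx, axis=1)   # (B, k)
    w = softmax(nn_sim / temperature, axis=1)       # neighbour weights

    # Target: similarity-weighted mixture of neighbour predictions.
    probs = softmax(logits, axis=1)
    target = (w[..., None] * probs[idx]).sum(axis=1)  # (B, C)

    # Cross-entropy between each prediction and its neighbour mixture;
    # added to the usual (noisy-label) classification loss as a regularizer.
    return float(-(target * np.log(probs + 1e-12)).sum(axis=1).mean())
```

The loss is small when neighbouring examples in feature space receive similar predictions, and large when they disagree — which is the mechanism the abstract describes for suppressing the influence of noisy labels.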

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| mini WebVision 1.0 | NCR+Mixup+DA (ResNet-50) | Top-1 Accuracy | 80.5 | | Unverified |
| mini WebVision 1.0 | NCR (ResNet-50) | Top-1 Accuracy | 77.1 | | Unverified |
| mini WebVision 1.0 | NCR+Mixup (ResNet-50) | Top-1 Accuracy | 79.4 | | Unverified |
| Red MiniImageNet (20% label noise) | NCR (ResNet-18) | Accuracy | 69 | | Unverified |
| Red MiniImageNet (40% label noise) | NCR (ResNet-18) | Accuracy | 64.6 | | Unverified |
| Red MiniImageNet (80% label noise) | NCR (ResNet-18) | Accuracy | 51.2 | | Unverified |
| WebVision-1000 | NCR (ResNet-50) | Top-1 Accuracy | 75.7 | | Unverified |
| WebVision-1000 | NCR+Mixup+DA (ResNet-50) | Top-1 Accuracy | 76.8 | | Unverified |
