Multi-Task Curriculum Framework for Open-Set Semi-Supervised Learning

2020-07-22 · ECCV 2020

Qing Yu, Daiki Ikami, Go Irie, Kiyoharu Aizawa

Abstract

Semi-supervised learning (SSL) leverages unlabeled data to train powerful models when only limited labeled data is available. While existing SSL methods assume that the labeled and unlabeled data share the same set of classes, we address a more complex, novel scenario named open-set SSL, in which out-of-distribution (OOD) samples are contained in the unlabeled data. Instead of training an OOD detector and an SSL model separately, we propose a multi-task curriculum learning framework. First, to detect the OOD samples in the unlabeled data, we estimate the probability that each sample belongs to OOD, using a joint optimization framework that alternately updates the network parameters and the OOD scores. Simultaneously, to achieve high classification performance on in-distribution (ID) data, we select the unlabeled samples with small OOD scores and use them, together with the labeled data, to train a deep neural network that classifies ID samples in a semi-supervised manner. We conduct several experiments, and our method achieves state-of-the-art results by successfully eliminating the effect of OOD samples.
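The alternating scheme in the abstract can be illustrated with a minimal toy sketch: repeatedly (1) fit a stand-in "model" to the samples currently believed to be in-distribution, (2) re-estimate per-sample OOD scores from each sample's loss, and (3) select the low-score unlabeled samples as ID data for SSL. This is a hedged illustration under simplified assumptions (a distance-to-mean surrogate for the network loss, min-max normalized scores, a 0.5 selection threshold); it is not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unlabeled pool: 80 ID samples near the origin, 20 OOD samples
# shifted by +3 in each dimension. (Illustrative data, not CIFAR.)
unlabeled = np.concatenate([rng.normal(0, 1, (80, 2)),
                            rng.normal(3, 1, (20, 2))])

# Initial OOD scores: optimistically treat everything as ID.
ood_score = np.zeros(len(unlabeled))

for epoch in range(5):
    # (1) "Train" the model on samples currently believed to be ID.
    # Here the model is just the mean of those samples; in the paper
    # this step updates deep network parameters instead.
    id_mean = unlabeled[ood_score <= 0.5].mean(axis=0)

    # Surrogate per-sample loss: distance to the estimated ID mean.
    per_sample_loss = np.linalg.norm(unlabeled - id_mean, axis=1)

    # (2) Alternate step: re-estimate OOD scores from the losses
    # (high loss -> likely OOD), normalized to [0, 1].
    lo, hi = per_sample_loss.min(), per_sample_loss.max()
    ood_score = (per_sample_loss - lo) / (hi - lo)

# (3) Curriculum selection: keep unlabeled samples with small OOD
# scores and feed them to the SSL classifier as ID data.
selected = unlabeled[ood_score < 0.5]
print(len(selected), "samples selected as ID for SSL training")
```

In this sketch the "model update" and the score update trade off against each other exactly as in the abstract's joint optimization: a better ID estimate sharpens the scores, and sharper scores clean up the pool used for the next update.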

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| CIFAR-10, 50 Labels (OpenSet, 6/4) | MTC | Accuracy | 79.7 | — | Unverified |
| CIFAR-10, 100 Labels (OpenSet, 6/4) | MTC | Accuracy | 86.6 | — | Unverified |
| CIFAR-10, 400 Labels (OpenSet, 6/4) | MTC | Accuracy | 91.0 | — | Unverified |
