SOTAVerified

Dual Student: Breaking the Limits of the Teacher in Semi-supervised Learning

2019-09-03 · ICCV 2019 · Code Available

Zhanghan Ke, Daoye Wang, Qiong Yan, Jimmy Ren, Rynson W. H. Lau


Abstract

Recently, consistency-based methods have achieved state-of-the-art results in semi-supervised learning (SSL). These methods always involve two roles, an explicit or implicit teacher model and a student model, and penalize predictions under different perturbations by a consistency constraint. However, the weights of these two roles are tightly coupled since the teacher is essentially an exponential moving average (EMA) of the student. In this work, we show that the coupled EMA teacher causes a performance bottleneck. To address this problem, we introduce Dual Student, which replaces the teacher with another student. We also define a novel concept, stable sample, following which a stabilization constraint is designed for our structure to be trainable. Further, we discuss two variants of our method, which produce even higher performance. Extensive experiments show that our method improves the classification performance significantly on several main SSL benchmarks. Specifically, it reduces the error rate of the 13-layer CNN from 16.84% to 12.39% on CIFAR-10 with 1k labels and from 34.10% to 31.56% on CIFAR-100 with 10k labels. In addition, our method also achieves a clear improvement in domain adaptation.
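The coupling the abstract describes can be seen directly in the Mean Teacher-style EMA update, where every teacher weight is a running average of the corresponding student weight. The sketch below is an illustration of that update rule (not the authors' code); the variable names and the decay value `alpha=0.99` are assumptions chosen for clarity.

```python
# Illustrative sketch of the EMA teacher update used by consistency-based SSL
# methods such as Mean Teacher. Because each teacher weight is an exponential
# moving average of the student weight, the two models' weights stay tightly
# coupled -- the bottleneck Dual Student replaces by training a second,
# independent student instead.

def ema_update(teacher_weights, student_weights, alpha=0.99):
    """Move each teacher weight a small step toward the student weight."""
    return [alpha * t + (1.0 - alpha) * s
            for t, s in zip(teacher_weights, student_weights)]

# Toy example: a fixed student pulls the teacher toward its weights.
teacher = [0.0, 0.0]
student = [1.0, -1.0]
for _ in range(5):
    teacher = ema_update(teacher, student)
```

After k updates against a fixed student, each teacher weight equals (1 - alpha^k) times the student weight, so the teacher can never move far from the student's trajectory.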


Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| CIFAR-100, 10000 Labels | Dual Student (480) | Percentage error | 32.77 | — | Unverified |
| CIFAR-10, 1000 Labels | Dual Student (600) | Accuracy | 85.83 | — | Unverified |
| CIFAR-10, 2000 Labels | Dual Student (600) | Accuracy | 89.28 | — | Unverified |
| CIFAR-10, 4000 Labels | Dual Student (600) | Percentage error | 8.89 | — | Unverified |
| ImageNet, 10% labeled data | Dual Student | Top-1 Accuracy | 63.52 | — | Unverified |
| SVHN, 250 Labels | Dual Student | Accuracy | 95.76 | — | Unverified |
| SVHN, 500 Labels | Dual Student | Accuracy | 96.04 | — | Unverified |

Reproductions