SOTAVerified

Effective Version Space Reduction for Convolutional Neural Networks

2020-06-22 · Unverified

Jiayu Liu, Ioannis Chiotellis, Rudolph Triebel, Daniel Cremers


Abstract

In active learning, sampling bias can pose a serious inconsistency problem and prevent the algorithm from finding the optimal hypothesis. However, many active learning methods for neural networks are hypothesis space agnostic and do not address this problem. We examine active learning with convolutional neural networks through the principled lens of version space reduction. We identify the connection between two approaches, prior mass reduction and diameter reduction, and propose a new diameter-based querying method: the minimum Gibbs-vote disagreement. By estimating version space diameter and bias, we illustrate how the version space of neural networks evolves and examine the realizability assumption. With experiments on the MNIST, Fashion-MNIST, SVHN and STL-10 datasets, we demonstrate that diameter reduction methods reduce the version space more effectively and perform better than prior mass reduction and other baselines, and that Gibbs-vote disagreement is on par with the best query method.
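The paper's exact acquisition rule is defined in the full text; as a rough illustration of the quantity involved, the sketch below estimates the Gibbs-vote disagreement of a sampled committee of hypotheses (e.g. MC-dropout forward passes of a CNN) on a pool of unlabeled candidates. The function name, the array shapes, and the final argmax selection are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gibbs_vote_disagreement(committee_probs):
    """Estimate Gibbs-vote disagreement per candidate point.

    committee_probs: array of shape (M, N, C) holding the predicted
    class probabilities of M sampled hypotheses (a proxy for draws
    from the version space) on N unlabeled candidates with C classes.

    Returns an (N,) array: for each candidate, the fraction of
    committee members (a random "Gibbs" hypothesis) whose predicted
    label disagrees with the committee's majority vote.
    """
    member_labels = committee_probs.argmax(axis=2)  # (M, N) hard labels
    num_classes = committee_probs.shape[2]
    # Majority-vote label for each candidate column.
    votes = np.apply_along_axis(
        lambda col: np.bincount(col, minlength=num_classes).argmax(),
        0, member_labels)
    # Probability that a uniformly drawn member disagrees with the vote.
    return (member_labels != votes[None, :]).mean(axis=0)

# Toy committee: 5 sampled hypotheses, 3 candidates, 2 classes.
rng = np.random.default_rng(0)
probs = rng.random((5, 3, 2))
d = gibbs_vote_disagreement(probs)
query_idx = int(d.argmax())  # one plausible acquisition rule, assumed here
```

In a real pipeline, `committee_probs` would come from repeated stochastic forward passes of the trained network over the unlabeled pool, and the disagreement scores would feed whatever selection criterion the diameter-reduction method prescribes.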

Tasks

Benchmark Results

Dataset   Model      Metric               Claimed   Verified   Status
STL-10    PWD        Percentage correct   59.45     -          Unverified
STL-10    GVD        Percentage correct   59.33     -          Unverified
STL-10    VR         Percentage correct   59.13     -          Unverified
STL-10    Core SET   Percentage correct   58.93     -          Unverified
STL-10    GE         Percentage correct   58.84     -          Unverified
STL-10    DFAL       Percentage correct   58.81     -          Unverified
STL-10    Random     Percentage correct   58.15     -          Unverified
STL-10    BALD-MCD   Percentage correct   57.35     -          Unverified
STL-10    M2-PWD     Percentage correct   57.31     -          Unverified

Reproductions