
SABAL: Sparse Approximation-based Batch Active Learning

2021-09-29

Maohao Shen, Bowen Jiang, Jacky Y. Zhang, Oluwasanmi O. Koyejo


Abstract

We propose a novel and general framework (i.e., SABAL) that formulates batch active learning as a sparse approximation problem. SABAL aims to find a weighted subset of the unlabeled data pool such that the corresponding training loss function approximates its full-data-pool counterpart. We realize the general framework as a sparsity-constrained discontinuous optimization problem that explicitly balances uncertainty and representation for large-scale applications, for which we propose both greedy and iterative hard thresholding schemes. The proposed method can adapt to various settings, including both Bayesian and non-Bayesian neural networks. Numerical experiments show that SABAL achieves state-of-the-art performance across different settings with lower computational complexity.
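The abstract mentions an iterative hard thresholding (IHT) scheme for the sparsity-constrained subset-selection problem. As a rough illustration of that generic technique (not the paper's actual objective or algorithm), the sketch below applies standard IHT to a stand-in least-squares problem: take a gradient step, then hard-threshold the weight vector to keep only its k largest-magnitude entries, analogous to retaining a weighted subset of k pool points. The matrix `A`, target `b`, and step-size choice are all illustrative assumptions.

```python
import numpy as np

def iterative_hard_thresholding(A, b, k, n_iters=100):
    """Generic IHT sketch: minimize ||A w - b||^2 subject to ||w||_0 <= k.

    In SABAL's setting (per the abstract), w would weight unlabeled pool
    points so the weighted loss approximates the full-pool loss; here A
    and b are stand-in least-squares quantities, not the paper's objective.
    """
    n = A.shape[1]
    w = np.zeros(n)
    # Step size 1 / ||A||_2^2 keeps the gradient step contractive.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iters):
        grad = A.T @ (A @ w - b)          # gradient of the squared loss
        w = w - step * grad               # gradient descent step
        keep = np.argsort(np.abs(w))[-k:] # indices of k largest entries
        mask = np.zeros(n, dtype=bool)
        mask[keep] = True
        w[~mask] = 0.0                    # hard threshold: zero the rest
    return w

# Synthetic demo: recover a 3-sparse weight vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
w_true = np.zeros(20)
w_true[[3, 7, 12]] = [1.5, -2.0, 0.8]
b = A @ w_true
w_hat = iterative_hard_thresholding(A, b, k=3)
print("selected indices:", sorted(np.nonzero(w_hat)[0].tolist()))
```

The hard-thresholding step is what enforces the batch-size budget: at every iteration the candidate solution is projected back onto the set of k-sparse vectors, so the final nonzero entries identify the selected (weighted) subset.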
