Robust and Active Learning for Deep Neural Network Regression

2021-07-28

Xi Li, George Kesidis, David J. Miller, Maxime Bergeron, Ryan Ferguson, Vladimir Lucic

Abstract

We describe a gradient-based method to discover local error maximizers of a deep neural network (DNN) used for regression, assuming the availability of an "oracle" capable of providing real-valued supervision (a regression target) for samples. For example, the oracle could be a numerical solver which, operationally, is much slower than the DNN. Given a discovered set of local error maximizers, the DNN is either fine-tuned or retrained in the manner of active learning.
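The abstract's core idea can be illustrated with a minimal sketch: ascend the squared error between a cheap surrogate model and a slow oracle with respect to the input, to locate a local error maximizer. This is an illustrative toy only; the names `model`, `oracle`, and `find_error_maximizer` are hypothetical, the "DNN" is replaced by a trivial 1-D function, and the gradient is taken by finite differences rather than backpropagation as a paper-agnostic assumption.

```python
import math

def model(x):
    # Stand-in for the trained DNN: the secant line of sin on [0, pi],
    # which is identically zero since sin(0) = sin(pi) = 0.
    return 0.0

def oracle(x):
    # Stand-in for the expensive numerical solver providing regression targets.
    return math.sin(x)

def find_error_maximizer(x0, lr=0.1, steps=300, h=1e-5):
    """Gradient *ascent* on the squared error (model - oracle)**2 w.r.t. the input."""
    err = lambda z: (model(z) - oracle(z)) ** 2
    x = x0
    for _ in range(steps):
        grad = (err(x + h) - err(x - h)) / (2 * h)  # finite-difference gradient
        x += lr * grad  # move toward higher error, not lower
    return x
```

Starting from x0 = 1.0, the ascent converges to pi/2, where the toy model's error against sin is largest; a set of such points would then be labeled by the oracle and used to fine-tune or retrain the model.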
