
Deeper Connections between Neural Networks and Gaussian Processes Speed-up Active Learning

2019-02-27 · Code Available

Evgenii Tsymbalov, Sergei Makarychev, Alexander Shapeev, Maxim Panov


Abstract

Active learning methods for neural networks are usually based on greedy criteria that ultimately yield a single new design point for evaluation. Such an approach requires either heuristics to sample a batch of design points in one active learning iteration, or retraining the neural network after each added data point, which is computationally inefficient. Moreover, uncertainty estimates for neural networks are sometimes overconfident for points lying far from the training sample. In this work we propose to approximate Bayesian neural networks (BNNs) by Gaussian processes, which allows us to update the uncertainty estimates of predictions efficiently without retraining the neural network, while avoiding overconfident uncertainty predictions for out-of-sample points. In a series of experiments on real-world data, including large-scale problems of chemical and physical modeling, we show the superiority of the proposed approach over state-of-the-art methods.
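The practical payoff described in the abstract is that a Gaussian-process surrogate lets you pick a whole batch of points by updating posterior variances, with no network retraining, because GP predictive variance depends only on the inputs, not the labels. The sketch below illustrates that idea in a minimal form; it is not the paper's exact algorithm. In particular, the RBF kernel, its lengthscale, and the noise level are placeholder assumptions — the paper derives the covariance from the BNN itself.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel between the rows of X and Y.
    # Placeholder for the BNN-induced covariance used in the paper.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def select_batch(X_train, X_pool, k, noise=1e-2, lengthscale=1.0):
    """Greedily pick k pool points with maximal GP posterior variance,
    conditioning on each pick before choosing the next one.
    No labels and no model retraining are needed: the GP posterior
    variance is a function of the inputs alone."""
    picked = []
    X_cond = X_train.copy()  # points the posterior is conditioned on
    for _ in range(k):
        K_cc = rbf_kernel(X_cond, X_cond, lengthscale) + noise * np.eye(len(X_cond))
        K_pc = rbf_kernel(X_pool, X_cond, lengthscale)
        # Posterior variance of every pool point: k(x,x) - k_pc K_cc^{-1} k_cp.
        sol = np.linalg.solve(K_cc, K_pc.T)
        var = 1.0 - np.einsum('ij,ji->i', K_pc, sol)
        # Exclude already-selected points from consideration.
        for i in picked:
            var[i] = -np.inf
        i = int(np.argmax(var))
        picked.append(i)
        # Condition on the new point (its label is not required).
        X_cond = np.vstack([X_cond, X_pool[i:i + 1]])
    return picked

rng = np.random.default_rng(0)
batch = select_batch(rng.normal(size=(5, 2)), rng.normal(size=(20, 2)), k=3)
```

The greedy variance-maximization loop naturally produces a diverse batch: once a point is conditioned on, the variance of its neighbors drops, so near-duplicates are not selected twice.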
