
Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference

2018-06-15

Kumar Shridhar, Felix Laumann, Marcus Liwicki


Abstract

We introduce a novel uncertainty estimation method for classification tasks in Bayesian convolutional neural networks with variational inference. By normalizing the output of a Softplus function in the final layer, we estimate aleatoric and epistemic uncertainty in a coherent manner. The intractable posterior probability distributions over weights are inferred by Bayes by Backprop. First, we demonstrate how this reliable variational inference method can serve as a fundamental construct for various network architectures. On multiple datasets in supervised learning settings (MNIST, CIFAR-10, CIFAR-100), this variational inference method achieves performance equivalent to frequentist inference in identical architectures, while the two desiderata, a measure for uncertainty and regularization, are incorporated naturally. Second, we examine how our proposed measure for aleatoric and epistemic uncertainties is derived and validate it on the aforementioned datasets.
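The abstract does not spell out the normalization or the uncertainty decomposition, so the following is only a plausible sketch of the idea: pass the final-layer outputs through Softplus, divide by their sum to obtain class probabilities, and then split predictive uncertainty into an epistemic part (variance of probabilities across weight samples) and an aleatoric part (mean per-sample variance). All function and variable names here are illustrative, not taken from the authors' code.

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x))
    return np.logaddexp(0.0, x)

def softplus_normalize(logits):
    # Normalize Softplus outputs so they sum to 1 over the class axis,
    # yielding a valid probability vector (all entries strictly positive).
    s = softplus(logits)
    return s / s.sum(axis=-1, keepdims=True)

# Toy example: class probabilities from several stochastic forward
# passes (i.e. weight samples drawn from the variational posterior).
rng = np.random.default_rng(0)
logit_samples = rng.normal(size=(10, 3))      # 10 weight samples, 3 classes
probs = softplus_normalize(logit_samples)     # per-sample class probabilities

mean_p = probs.mean(axis=0)                   # predictive mean
# Epistemic uncertainty: spread of probabilities across weight samples.
epistemic = probs.var(axis=0)
# Aleatoric uncertainty: mean per-sample variance p * (1 - p).
aleatoric = (probs * (1.0 - probs)).mean(axis=0)
```

Unlike Softmax, this normalization is not shift-invariant and does not exponentiate the logits, which is the property the paper exploits to read off uncertainties coherently.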
