Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference
Kumar Shridhar, Felix Laumann, Marcus Liwicki
Code
- github.com/liqichen6688/baycnn (pytorch) ★ 0
- github.com/kumar-shridhar/PyTorch-BayesianCNN (pytorch) ★ 0
- github.com/kumar-shridhar/BayesianConvNet (pytorch) ★ 0
- github.com/Anou9531/Bayesian-CNN (pytorch) ★ 0
- github.com/MindSpore-scientific-2/code-8/tree/main/Uncertainty_Calibration_Object_Detection (mindspore) ★ 0
- github.com/nomercy77/Implementing-Bayesian-CNN (pytorch) ★ 0
- github.com/MindSpore-scientific-2/code-1/tree/main/Uncertainty_Calibration_Object_Detection (mindspore) ★ 0
- github.com/MindSpore-scientific-2/code-11/tree/main/Uncertainty_Calibration_Object_Detection (mindspore) ★ 0
- github.com/MindSpore-scientific-2/code-2/tree/main/Uncertainty_Calibration_Object_Detection (mindspore) ★ 0
Abstract
We introduce a novel uncertainty estimation method for classification tasks in Bayesian convolutional neural networks with variational inference. By normalizing the output of a Softplus function in the final layer, we estimate aleatoric and epistemic uncertainty in a coherent manner. The intractable posterior probability distributions over weights are inferred by Bayes by Backprop. First, we demonstrate how this reliable variational inference method can serve as a fundamental construct for various network architectures. On multiple datasets in supervised learning settings (MNIST, CIFAR-10, CIFAR-100), this variational inference method achieves performance equivalent to frequentist inference in identical architectures, while the two desiderata, a measure for uncertainty and regularization, are incorporated naturally. Second, we examine how our proposed measure for aleatoric and epistemic uncertainties is derived and validate it on the aforementioned datasets.
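The two ideas in the abstract can be illustrated with a minimal NumPy sketch (not the authors' implementation; all function names here are illustrative): the final-layer logits are passed through a Softplus and normalized to a probability vector, and the predictive uncertainty from T stochastic forward passes is decomposed into an aleatoric and an epistemic covariance term.

```python
import numpy as np

def softplus(x):
    """Softplus activation: log(1 + exp(x)), computed stably via log1p."""
    return np.log1p(np.exp(x))

def softplus_normalize(logits):
    """Map final-layer logits to probabilities by normalizing Softplus outputs
    (instead of the usual Softmax)."""
    s = softplus(logits)
    return s / s.sum(axis=-1, keepdims=True)

def uncertainty_decomposition(prob_samples):
    """Decompose predictive uncertainty from T stochastic forward passes.

    prob_samples: array of shape (T, C), one probability vector per pass.
    Returns (aleatoric, epistemic), each a (C, C) covariance-like matrix:
      aleatoric: mean over passes of diag(p_t) - p_t p_t^T
      epistemic: mean over passes of (p_t - p_bar)(p_t - p_bar)^T
    """
    p_bar = prob_samples.mean(axis=0)
    aleatoric = np.mean([np.diag(p) - np.outer(p, p) for p in prob_samples], axis=0)
    epistemic = np.mean([np.outer(p - p_bar, p - p_bar) for p in prob_samples], axis=0)
    return aleatoric, epistemic
```

In this sketch, each of the T probability vectors would come from a forward pass with weights sampled from the variational posterior learned by Bayes by Backprop; the aleatoric term captures noise inherent in the data, while the epistemic term captures disagreement between weight samples.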