
PAC-Bayesian Neural Network Bounds

2019-09-25

Yossi Adi, Alex Schwing, Tamir Hazan



Abstract

Bayesian neural networks, which both use the negative log-likelihood loss function and average their predictions using a learned posterior over the parameters, have been used successfully across many scientific fields, partly due to their ability to 'effortlessly' extract desired representations from many large-scale datasets. However, generalization bounds for this setting are still missing. In this paper, we present a new PAC-Bayesian generalization bound for the negative log-likelihood loss which utilizes the Herbst argument for the log-Sobolev inequality to bound the moment generating function of the learner's risk.
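As background, bounds of this kind typically instantiate the generic PAC-Bayesian template below (an Alquier-style form, not the paper's exact statement; the symbols here are standard conventions assumed for illustration, with $P$ a prior, $Q$ the learned posterior, $L$ the true risk, $\hat L_S$ the empirical risk on sample $S$, $\lambda > 0$ a free parameter, and $1-\delta$ the confidence level). The difficulty the abstract refers to is controlling the final moment generating function term, which is where the Herbst argument for the log-Sobolev inequality enters:

```latex
% Generic PAC-Bayes template; the paper's contribution is bounding the
% moment-generating-function term (the last log-expectation) for the
% unbounded negative log-likelihood loss.
\mathbb{E}_{\theta\sim Q}\!\big[L(\theta)\big]
  \;\le\; \mathbb{E}_{\theta\sim Q}\!\big[\hat L_S(\theta)\big]
  \;+\; \frac{1}{\lambda}\Big(
      \mathrm{KL}(Q \,\|\, P) \;+\; \ln\tfrac{1}{\delta}
      \;+\; \ln \mathbb{E}_{\theta\sim P}\,\mathbb{E}_{S}\,
        e^{\lambda\,\left(L(\theta)-\hat L_S(\theta)\right)}
    \Big)
```

For bounded losses the MGF term is handled by Hoeffding-type arguments; for the unbounded negative log-likelihood loss a different tool is needed, which motivates the log-Sobolev approach.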
