Mean Field Theory of Activation Functions in Deep Neural Networks

2018-05-22

Mirco Milletarí, Thiparat Chotibut, Paolo E. Trevisanutto

Abstract

We present a Statistical Mechanics (SM) model of deep neural networks, connecting the energy-based and the feed-forward network (FFN) approaches. We infer that FFNs can be understood as performing three basic steps: encoding, representation validation, and propagation. From the mean-field solution of the model, we obtain a set of natural activation functions -- such as Sigmoid and ReLU -- together with the state-of-the-art Swish; this activation represents the expected information propagating through the network and tends to ReLU in the limit of zero noise. We study the spectrum of the Hessian on an associated classification task, showing that Swish allows for more consistent performance over a wider range of network architectures.
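The zero-noise limit claimed in the abstract can be checked numerically. The sketch below assumes the standard Swish definition, swish(x) = x · σ(βx), and treats β as an inverse-noise parameter (that mapping is our assumption, intended only to illustrate the limiting behaviour, not the paper's exact derivation): as β grows, Swish approaches ReLU pointwise.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def swish(x, beta=1.0):
    # Standard Swish: x * sigmoid(beta * x).
    # Here beta is taken as an inverse-noise parameter (our assumption).
    return x * sigmoid(beta * x)

def relu(x):
    return max(0.0, x)

# As beta -> infinity (the "zero noise" limit), swish(x) -> relu(x) pointwise.
for x in (-2.0, -0.5, 0.5, 2.0):
    print(f"x={x:+.1f}  swish(beta=50)={swish(x, 50.0):+.6f}  relu={relu(x):+.1f}")
```

For moderate β (e.g. β = 1, the common default), Swish remains smooth and non-monotonic near the origin, which is the regime the paper's classification experiments probe.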
