SOTAVerified

Sequential Neural Models with Stochastic Layers

2016-05-24 · NeurIPS 2016 · Code Available

Marco Fraccaro, Søren Kaae Sønderby, Ulrich Paquet, Ole Winther

Abstract

How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks, which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural generative model. The clear separation of deterministic and stochastic layers allows a structured variational inference network to track the factorization of the model's posterior distribution. By retaining both the nonlinear recursive structure of a recurrent neural network and averaging over the uncertainty in a latent path, like a state space model, we improve the state-of-the-art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performance to competing methods on polyphonic music modeling.
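To make the layering described above concrete, here is a minimal PyTorch-style sketch of the generative path: a deterministic GRU layer d_t driven by the previous observation, a stochastic state-space layer z_t conditioned on d_t and z_{t-1}, and an emission of x_t from both. The dimensions and module names (prior_net, decoder) are illustrative assumptions, not the authors' implementation.

```python
# Sketch only: a stochastic RNN generative pass with a deterministic GRU layer
# stacked under a stochastic state-space layer. Names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import Normal


class SRNNGenerativeSketch(nn.Module):
    def __init__(self, x_dim=88, d_dim=64, z_dim=16):
        super().__init__()
        self.x_dim, self.d_dim, self.z_dim = x_dim, d_dim, z_dim
        # Deterministic layer: d_t = f(d_{t-1}, x_{t-1})
        self.gru = nn.GRUCell(x_dim, d_dim)
        # Stochastic transition (prior): p(z_t | z_{t-1}, d_t)
        self.prior_net = nn.Linear(d_dim + z_dim, 2 * z_dim)
        # Emission: p(x_t | z_t, d_t), e.g. Bernoulli logits for piano-roll data
        self.decoder = nn.Linear(d_dim + z_dim, x_dim)

    def forward(self, x):
        """x: (T, batch, x_dim). Returns per-step emission logits for one
        sampled latent path (training averages over the inference posterior)."""
        T, B, _ = x.shape
        d = x.new_zeros(B, self.d_dim)
        z = x.new_zeros(B, self.z_dim)
        x_prev = x.new_zeros(B, self.x_dim)
        logits = []
        for t in range(T):
            d = self.gru(x_prev, d)                          # deterministic recursion
            mu, s = self.prior_net(torch.cat([d, z], -1)).chunk(2, dim=-1)
            z = Normal(mu, F.softplus(s) + 1e-4).rsample()   # stochastic transition
            logits.append(self.decoder(torch.cat([d, z], -1)))
            x_prev = x[t]
        return torch.stack(logits)                           # (T, batch, x_dim)
```

The structured variational inference network mentioned in the abstract approximates the posterior over the latent path given the deterministic states and the observations; the sketch above covers only the generative side.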

Tasks

Reproductions