
Discovering Discrete Latent Topics with Neural Variational Inference

2017-06-01 · ICML 2017 · Code Available

Yishu Miao, Edward Grefenstette, Phil Blunsom

Abstract

Topic models have been widely explored as probabilistic generative models of documents. Traditional inference methods have sought closed-form derivations for updating the models; however, as the expressiveness of these models grows, so does the difficulty of performing fast and accurate inference over their parameters. This paper presents alternative neural approaches to topic modelling by providing parameterisable distributions over topics which permit training by backpropagation in the framework of neural variational inference. In addition, with the help of a stick-breaking construction, we propose a recurrent network that is able to discover a notionally unbounded number of topics, analogous to Bayesian non-parametric topic models. Experimental results on the MXM Song Lyrics, 20NewsGroups and Reuters News datasets demonstrate the effectiveness and efficiency of these neural topic models.
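As a rough illustration of the approach the abstract describes, the sketch below builds a Gaussian-softmax style neural topic model: an inference network maps a bag-of-words vector to Gaussian parameters, the reparameterisation trick keeps sampling differentiable, and a softmax turns the sample into topic proportions. A stick-breaking helper in the spirit of the recurrent construction is also included. This is a minimal sketch assuming PyTorch; the names (GSMTopicModel, stick_breaking, etc.) are illustrative and not taken from the authors' released code.

```python
# Minimal sketch of a Gaussian-softmax neural variational topic model.
# Hypothetical names; assumes PyTorch. Not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GSMTopicModel(nn.Module):
    def __init__(self, vocab_size, num_topics=50, hidden=256):
        super().__init__()
        # Inference network q(z|d): bag-of-words counts -> Gaussian parameters
        self.encoder = nn.Sequential(nn.Linear(vocab_size, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, num_topics)
        self.logvar = nn.Linear(hidden, num_topics)
        # Topic-word weights (decoder): topic proportions -> vocabulary logits
        self.beta = nn.Linear(num_topics, vocab_size, bias=False)

    def forward(self, bow):                      # bow: (batch, vocab_size) term counts
        h = self.encoder(bow)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: differentiable sample z ~ N(mu, sigma^2)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        theta = F.softmax(z, dim=-1)             # document-topic proportions
        log_word_probs = F.log_softmax(self.beta(theta), dim=-1)
        # Negative ELBO: reconstruction term + KL(q(z|d) || N(0, I))
        recon = -(bow * log_word_probs).sum(-1)
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1)
        return (recon + kl).mean()

def stick_breaking(breaks):
    """Turn break fractions in (0, 1) into topic proportions.

    theta_k = breaks_k * prod_{i<k} (1 - breaks_i), so a recurrent network
    emitting one break per step can represent a notionally unbounded topic set.
    """
    remaining = torch.cumprod(1.0 - breaks, dim=-1)
    shifted = torch.cat([torch.ones_like(breaks[..., :1]),
                         remaining[..., :-1]], dim=-1)
    return breaks * shifted
```

Training would iterate over minibatches of bag-of-words vectors and minimise the returned loss with a stochastic optimiser such as Adam; swapping F.softmax(z, dim=-1) for stick_breaking(torch.sigmoid(z)) gives a stick-breaking parameterisation of the topic distribution.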

Benchmark Results

Dataset         Model   Metric   Claimed   Verified   Status
20NewsGroups    GSM     C_v      0.55      -          Unverified
AG News         GSM     C_v      0.41      -          Unverified
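The Metric column reports C_v topic coherence, which scores each topic by co-occurrence statistics of its top words. A minimal sketch of computing it, assuming the gensim library; the toy `texts` and `topics` inputs are hypothetical placeholders for real tokenised documents and learned topics:

```python
# Sketch: computing C_v topic coherence with gensim (assumed dependency).
from gensim.corpora import Dictionary
from gensim.models import CoherenceModel

# Toy inputs for illustration only; use real tokenised corpora in practice.
texts = [["topic", "models", "documents"], ["neural", "variational", "inference"]]
topics = [["topic", "models"], ["neural", "inference"]]   # top words per topic

dictionary = Dictionary(texts)
cm = CoherenceModel(topics=topics, texts=texts,
                    dictionary=dictionary, coherence="c_v")
print(cm.get_coherence())  # higher is better; the table reports this score
```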

Reproductions

No reproductions have been submitted yet.