Convergence Rates of Latent Topic Models Under Relaxed Identifiability Conditions

2017-10-30

Yining Wang

Abstract

In this paper we study the frequentist convergence rate of the maximum likelihood estimator for Latent Dirichlet Allocation (Blei et al., 2003) topic models. We show that the maximum likelihood estimator converges to one of the finitely many equivalent parameters in the Wasserstein distance at a rate of n^{-1/4}, without assuming separability or non-degeneracy of the underlying topics or the existence of more than three words per document, thus generalizing the previous works of Anandkumar et al. (2012, 2014) from an information-theoretic perspective. We also show that the n^{-1/4} convergence rate is optimal in the worst case.
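To illustrate why convergence is measured "to one of the finitely many equivalent parameters" in a Wasserstein-type distance, the sketch below computes a matching-based distance between two topic matrices. This is a hypothetical illustration, not the paper's exact definition: it assumes uniform mixing weights (so the optimal coupling reduces to a permutation) and uses total variation as the ground cost between topics.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def topic_wasserstein(A, B):
    """Matching-based (W1-style) distance between two topic matrices.

    A, B: (k, V) arrays whose rows are topic distributions over a
    vocabulary of size V.  With uniform mixing weights the optimal
    coupling is a permutation of topic labels (a simplifying
    assumption for this sketch).
    """
    # Total-variation cost between every pair of topics, shape (k, k).
    cost = 0.5 * np.abs(A[:, None, :] - B[None, :, :]).sum(-1)
    # Optimal one-to-one matching of topics (Hungarian algorithm).
    r, c = linear_sum_assignment(cost)
    return cost[r, c].mean()

# A relabeled copy of the same topics is at distance 0, reflecting
# that permuted parameters are statistically equivalent.
rng = np.random.default_rng(0)
A = rng.dirichlet(np.ones(5), size=3)  # 3 topics over a 5-word vocabulary
B = A[[2, 0, 1]]                       # same topics, permuted labels
print(topic_wasserstein(A, B))         # → 0.0
```

Because label permutations leave the likelihood unchanged, any distance used to state a convergence rate must quotient them out; the optimal matching above is the discrete analogue of the optimal coupling in the Wasserstein distance.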
