
Sequential Learning for Dirichlet Process Mixtures

2019-10-16 · Approximate Inference (AABI) Symposium 2019

Chunlin Ji, Bin Liu, Yingkai Jiang, Ke Deng


Abstract

The Dirichlet process (DP) mixture model provides a flexible nonparametric framework for unsupervised learning. Monte Carlo sampling methods involve heavy computation, while conventional variational inference requires careful design of the variational distribution and the conditional expectations. In this work, we treat the DP mixture itself as the variational proposal and view the given data as samples drawn from the unknown target distribution. We propose an evidence upper bound (EUBO) to act as the surrogate loss and fit a DP mixture to the given data by minimizing the EUBO, which is equivalent to minimizing the KL divergence between the target distribution and the DP mixture. We describe three advantages of EUBO-based DP mixture fitting and show how to build a black-box-style sequential learning algorithm. Optimization uses stochastic gradient descent (SGD), leveraging automatic differentiation tools. Simulation studies demonstrate the efficiency of the proposed methods.
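The core idea of the abstract — minimizing the KL divergence between an unknown target p and a mixture proposal q when only samples from p are available — reduces, up to the constant E_p[log p], to minimizing the Monte Carlo EUBO: the negative mean log-density of the samples under q. A minimal sketch of that reduction, not the authors' implementation: a finite Gaussian mixture stands in for the truncated DP mixture, and central finite differences stand in for the automatic differentiation the paper relies on.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in "unknown target": as in the paper's setup, only its samples are observed.
x = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(3.0, 0.5, 500)])

def log_q(xs, means, log_sigmas, logits):
    """Log-density of a K-component Gaussian mixture (a finite stand-in
    for the DP mixture variational proposal)."""
    log_w = logits - np.logaddexp.reduce(logits)          # normalized log-weights
    lp = (log_w - log_sigmas - 0.5 * np.log(2 * np.pi)
          - 0.5 * ((xs[:, None] - means) / np.exp(log_sigmas)) ** 2)
    m = lp.max(axis=1, keepdims=True)                     # log-sum-exp over components
    return (m + np.log(np.exp(lp - m).sum(axis=1, keepdims=True))).ravel()

def eubo(params):
    """Monte Carlo EUBO: -E_p[log q], estimated on the given samples.
    Minimizing it matches minimizing KL(p || q) up to the constant E_p[log p]."""
    means, log_sigmas, logits = np.split(params, 3)
    return -log_q(x, means, log_sigmas, logits).mean()

def num_grad(f, p, eps=1e-5):
    # Central finite differences in place of automatic differentiation.
    g = np.zeros_like(p)
    for i in range(p.size):
        d = np.zeros_like(p); d[i] = eps
        g[i] = (f(p + d) - f(p - d)) / (2 * eps)
    return g

# K = 2 components, parameters packed as [means | log-sigmas | weight logits].
params = np.concatenate([np.array([-1.0, 1.0]), np.zeros(2), np.zeros(2)])
for _ in range(2000):                                     # plain gradient descent
    params -= 0.05 * num_grad(eubo, params)

print(np.sort(params[:2]))  # fitted component means, pulled toward the two modes
```

With samples from this bimodal target, the fitted component means migrate toward the two modes as the EUBO falls; in the paper's setting, a stick-breaking DP mixture and SGD with autodiff replace this toy setup.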
