
Scalable Deep Unsupervised Clustering with Concrete GMVAEs

2019-09-18

Mark Collier, Hector Urdiales


Abstract

Discrete random variables are natural components of probabilistic clustering models, and a number of VAE variants with discrete latent variables have been developed. Training such methods requires marginalizing over the discrete latent variables, making training time complexity linear in the number of clusters. By applying a continuous relaxation to the discrete variables in these methods, we reduce the training time complexity to constant in the number of clusters. We demonstrate that in practice, for one such method, the Gaussian Mixture VAE, the continuous relaxation has no negative effect on the quality of the clustering while providing a substantial reduction in training time, cutting training time on CIFAR-100 with 20 clusters from 47 hours to less than 6 hours.
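The continuous relaxation the abstract refers to is the Concrete (Gumbel-Softmax) distribution: instead of marginalizing over all K cluster assignments, training draws a single relaxed one-hot sample, so the per-step cost no longer grows with K. A minimal sketch of the sampler, in plain Python (the function name and the clamping epsilon are illustrative choices, not from the paper):

```python
import math
import random

def sample_concrete(logits, temperature):
    """Draw one relaxed one-hot sample from a Concrete (Gumbel-Softmax)
    distribution over len(logits) categories.

    As temperature -> 0 the sample approaches a discrete one-hot vector;
    larger temperatures give smoother points on the simplex.
    """
    # Gumbel(0, 1) noise via the inverse CDF; clamp away from 0 so the
    # inner log is always defined.
    gumbels = [-math.log(-math.log(max(random.random(), 1e-12)))
               for _ in logits]
    # Temperature-scaled softmax of the perturbed logits
    # (max-subtraction for numerical stability).
    scaled = [(l + g) / temperature for l, g in zip(logits, gumbels)]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    return [e / z for e in exps]

# One sample replaces a K-term marginalization over cluster assignments.
logits = [1.0, 2.0, 0.5]
sample = sample_concrete(logits, temperature=0.5)
```

Because the sample is a differentiable function of the logits, gradients flow through it directly, which is what lets a single sample stand in for the full sum over clusters during training.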
