On stochastic gradient Langevin dynamics with dependent data streams: the fully non-convex case
2019-05-30
Ngoc Huy Chau, Éric Moulines, Miklos Rásonyi, Sotirios Sabanis, Ying Zhang
Abstract
We consider the problem of sampling from a target distribution, which is not necessarily log-concave, in the context of empirical risk minimization and stochastic optimization as presented in Raginsky et al. (2017). Non-asymptotic results are established in the L^1-Wasserstein distance for the behaviour of Stochastic Gradient Langevin Dynamics (SGLD) algorithms. We allow the estimation of gradients to be performed even in the presence of dependent data streams. Our convergence estimates are sharper and uniform in the number of iterations, in contrast to those in previous studies.
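For readers unfamiliar with the algorithm the abstract analyzes, a minimal sketch of a plain SGLD iteration follows. The function and parameter names (`grad_u`, `step`, `beta`) and the double-well potential used in the example are illustrative choices, not taken from the paper; the update rule theta_{k+1} = theta_k - step * grad + sqrt(2 * step / beta) * noise is the standard SGLD recursion.

```python
import numpy as np

def sgld(grad_u, theta0, step, beta, n_iters, rng):
    """Run plain SGLD: theta <- theta - step*grad_u(theta) + sqrt(2*step/beta)*xi.

    grad_u may return a stochastic estimate of the gradient of the
    potential U (e.g. a minibatch gradient); beta is the inverse
    temperature of the target measure proportional to exp(-beta * U).
    """
    theta = np.asarray(theta0, dtype=float)
    samples = np.empty((n_iters, theta.size))
    for k in range(n_iters):
        noise = rng.standard_normal(theta.shape)
        theta = theta - step * grad_u(theta) + np.sqrt(2.0 * step / beta) * noise
        samples[k] = theta
    return samples

# Example: a non-convex double-well potential U(x) = (x^2 - 1)^2 / 4,
# whose gradient is x * (x^2 - 1). The target is not log-concave.
rng = np.random.default_rng(0)
samples = sgld(lambda x: x * (x**2 - 1.0),
               theta0=np.array([2.0]), step=0.01, beta=4.0,
               n_iters=20000, rng=rng)
```

Under the paper's setting, `grad_u` would be evaluated on a (possibly dependent) data stream rather than on i.i.d. minibatches; the recursion itself is unchanged.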