
Rates of convergence for density estimation with generative adversarial networks

2021-01-30

Nikita Puchkin, Sergey Samsonov, Denis Belomestny, Eric Moulines, Alexey Naumov


Abstract

In this work we undertake a thorough study of the non-asymptotic properties of vanilla generative adversarial networks (GANs). We prove an oracle inequality for the Jensen-Shannon (JS) divergence between the underlying density p^* and the GAN estimate, with a significantly better statistical error term compared to previously known results. The advantage of our bound becomes clear in application to nonparametric density estimation. We show that the JS-divergence between the GAN estimate and p^* decays as fast as (log n / n)^{2β/(2β + d)}, where n is the sample size and β determines the smoothness of p^*. This rate of convergence coincides (up to logarithmic factors) with the minimax optimal rate for the considered class of densities.
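To illustrate how the stated rate (log n / n)^{2β/(2β + d)} behaves, the following minimal sketch evaluates it for a few sample sizes and smoothness levels. The function name `js_rate` and the example values of n, β, and d are illustrative choices, not part of the paper.

```python
import math

def js_rate(n: int, beta: float, d: int) -> float:
    """Evaluate the rate (log n / n)^(2*beta / (2*beta + d)) from the abstract.

    n is the sample size, beta the smoothness of p^*, d the dimension.
    """
    return (math.log(n) / n) ** (2 * beta / (2 * beta + d))

# Smoother densities (larger beta) yield a faster-decaying rate,
# while higher dimension d slows the decay (curse of dimensionality).
r_smooth = js_rate(10_000, beta=2.0, d=2)   # exponent 4/6
r_rough = js_rate(10_000, beta=0.5, d=2)    # exponent 1/3
r_highd = js_rate(10_000, beta=2.0, d=10)   # exponent 4/14
print(r_smooth, r_rough, r_highd)
```

The comparison reflects the usual nonparametric trade-off: the exponent 2β/(2β + d) approaches 1 (the parametric rate, up to log factors) as β grows, and shrinks toward 0 as d grows.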
