SOTAVerified

Non-asymptotic bounds for forward processes in denoising diffusions: Ornstein-Uhlenbeck is hard to beat

2024-08-25

Miha Brešar, Aleksandar Mijatović


Abstract

Denoising diffusion probabilistic models (DDPMs) represent a recent advance in generative modelling that has delivered state-of-the-art results across many domains of application. Despite their success, a rigorous theoretical understanding of the error within DDPMs, particularly the non-asymptotic bounds required for comparing their efficiency, remains scarce. Making minimal assumptions on the initial data distribution, allowing for example the manifold hypothesis, this paper presents explicit non-asymptotic bounds on the forward diffusion error in total variation (TV), expressed as a function of the terminal time T. We parametrise multi-modal data distributions in terms of the distance R to their furthest modes and consider forward diffusions with additive and multiplicative noise. Our analysis rigorously proves that, under mild assumptions, the canonical choice of the Ornstein-Uhlenbeck (OU) process cannot be significantly improved in terms of reducing the terminal time T as a function of R and the error tolerance ε > 0. Motivated by data distributions arising in generative modelling, we also establish a cut-off-like phenomenon (as R → ∞) for the convergence in TV of an OU process to its invariant measure, when initialised at a multi-modal distribution with maximal mode distance R.
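The forward-error question studied here can be illustrated numerically. The sketch below, a minimal illustration and not the paper's method, simulates the standard OU forward process dX_t = -X_t dt + √2 dW_t (whose stationary law is N(0, 1)) started from a bimodal distribution with mode distance R, and prints how close the marginal is to the invariant Gaussian at several terminal times T; the choice R = 10 and the time grid are illustrative assumptions.

```python
import numpy as np

def ou_forward(x0, t, rng):
    """Exact OU transition for dX = -X dt + sqrt(2) dW:
    X_t = x0 * e^{-t} + sqrt(1 - e^{-2t}) * Z, with Z ~ N(0, 1).
    The invariant measure is N(0, 1)."""
    decay = np.exp(-t)
    noise_scale = np.sqrt(1.0 - np.exp(-2.0 * t))
    return x0 * decay + noise_scale * rng.standard_normal(x0.shape)

rng = np.random.default_rng(0)
R = 10.0        # maximal mode distance (illustrative choice)
n = 100_000     # number of Monte Carlo samples

# Bimodal initial data: equal-weight point masses at +-R/2.
x0 = np.where(rng.random(n) < 0.5, -R / 2, R / 2).astype(float)

# After T of order log R the initial modes have collapsed to O(1) scale;
# by T = 3 log R the marginal is numerically indistinguishable from N(0, 1).
for T in (1.0, np.log(R), 3 * np.log(R)):
    xT = ou_forward(x0, T, rng)
    print(f"T = {T:5.2f}: sample mean = {xT.mean():+.4f}, sample std = {xT.std():.4f}")
```

At T = 1 the two modes are still clearly separated (x0·e^{-1} ≈ ±1.84), while at T = 3 log R the residual mode displacement is R/2 · R^{-3} = 0.0005, consistent with the log R scaling of the terminal time in the paper's bounds.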
