[2308.03686] Nearly $d$-Linear Convergence Bounds for Diffusion Models via Stochastic Localization


Download a PDF of the paper titled Nearly $d$-Linear Convergence Bounds for Diffusion Models via Stochastic Localization, by Joe Benton and 3 other authors


Abstract: Denoising diffusions are a powerful method to generate approximate samples from high-dimensional data distributions. Recent results provide polynomial bounds on their convergence rate, assuming $L^2$-accurate scores. Until now, the tightest bounds were either superlinear in the data dimension or required strong smoothness assumptions. We provide the first convergence bounds which are linear in the data dimension (up to logarithmic factors) assuming only finite second moments of the data distribution. We show that diffusion models require at most $\tilde O(\frac{d \log^2(1/\delta)}{\varepsilon^2})$ steps to approximate an arbitrary distribution on $\mathbb{R}^d$ corrupted with Gaussian noise of variance $\delta$ to within $\varepsilon^2$ in KL divergence. Our proof extends the Girsanov-based methods of previous works. We introduce a refined treatment of the error from discretizing the reverse SDE inspired by stochastic localization.
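To illustrate the object the bound concerns, here is a minimal sketch (not the paper's algorithm) of an Euler-Maruyama discretization of the reverse SDE for an Ornstein-Uhlenbeck forward process. The data distribution, time horizon, step count, and early-stopping time $\delta$ are all assumptions chosen for illustration: the data is a point mass at 0, so the score $\nabla \log p_t(x) = -x/(1 - e^{-2t})$ is known in closed form and no learned score model is needed.

```python
import numpy as np

# Illustrative sketch, assuming: OU forward process dX_t = -X_t dt + sqrt(2) dB_t,
# data = point mass at 0, so p_t = N(0, 1 - exp(-2t)) and the exact score is
# grad log p_t(x) = -x / (1 - exp(-2t)).  Values of T, delta, n_steps are arbitrary.
rng = np.random.default_rng(0)
T, delta, n_steps, n_samples = 5.0, 1e-2, 500, 10_000

# Integrate the reverse SDE from time T down to time delta (early stopping),
# matching the "corrupted with Gaussian noise of variance delta" target.
ts = np.linspace(T, delta, n_steps + 1)
x = rng.standard_normal(n_samples)  # initialize from N(0, 1), close to p_T

for k in range(n_steps):
    t, h = ts[k], ts[k] - ts[k + 1]
    score = -x / (1.0 - np.exp(-2.0 * t))
    # Reverse-time dynamics: drift x + 2 * score, diffusion coefficient sqrt(2).
    x = x + h * (x + 2.0 * score) + np.sqrt(2.0 * h) * rng.standard_normal(n_samples)

# The samples should concentrate near 0 with variance close to 1 - exp(-2*delta),
# up to discretization error, which is what the paper's KL bound controls.
print(float(np.var(x)))
```

The discretization error of this uniform-step scheme is exactly the quantity the paper's analysis refines; the paper's $\tilde O(d \log^2(1/\delta) / \varepsilon^2)$ bound concerns the number of such steps needed in dimension $d$ with an $L^2$-accurate learned score in place of the exact one used here.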

Submission history

From: Joe Benton
Mon, 7 Aug 2023 16:01:14 UTC (152 KB)
Thu, 18 Jan 2024 14:54:37 UTC (173 KB)
Wed, 6 Mar 2024 00:41:30 UTC (174 KB)
