Related Researcher

Lim, Dong-Young (임동영)
On diffusion-based generative models and their error bounds: The log-concave case with full convergence estimates

Author(s)
Bruno, Stefano; Zhang, Ying; Lim, Dong-Young; Akyildiz, Ömer Deniz; Sabanis, Sotirios
Issued Date
2025-02
DOI
10.48550/arXiv.2311.13584
URI
https://scholarworks.unist.ac.kr/handle/201301/86736
Citation
TRANSACTIONS ON MACHINE LEARNING RESEARCH (TMLR), v.2025
Abstract
We provide full theoretical guarantees for the convergence behaviour of diffusion-based generative models under the assumption of strongly log-concave data distributions, while the approximating class of functions used for score estimation consists of Lipschitz continuous functions, thereby avoiding any Lipschitzness assumption on the score function. We demonstrate the power of our approach via a motivating example: sampling from a Gaussian distribution with unknown mean. In this case, explicit estimates are provided for the associated optimization problem, i.e. score approximation, and these are combined with the corresponding sampling estimates. As a result, we obtain the best known upper bound estimates, in terms of key quantities of interest such as the dimension and the rates of convergence, for the Wasserstein-2 distance between the data distribution (a Gaussian with unknown mean) and our sampling algorithm. Beyond the motivating example, and in order to allow for the use of a diverse range of stochastic optimizers, we present our results using an L2-accurate score estimation assumption, which crucially is formed under an expectation with respect to the stochastic optimizer and our novel auxiliary process that uses only known information. This approach yields the best known convergence rate for our sampling algorithm. © 2025, Transactions on Machine Learning Research. All rights reserved.
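The motivating example in the abstract can be sketched in a few lines: for data drawn from N(μ, I) with unknown mean μ, the score function is linear, s(x) = -(x - μ), so estimating μ from samples estimates the score, and a score-driven sampler can then target the data distribution. The sketch below uses unadjusted Langevin dynamics as a simple stand-in for the paper's diffusion-based sampling algorithm; all names (`mu_true`, `mu_hat`, step sizes, sample counts) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 2, 10_000
mu_true = np.array([1.0, -2.0])

# Data from the (strongly log-concave) target N(mu_true, I_d).
data = mu_true + rng.standard_normal((n, d))

# Score estimation: for N(mu, I) the score is s(x) = -(x - mu),
# so the empirical mean gives a Lipschitz score estimate.
mu_hat = data.mean(axis=0)

def score(x):
    return -(x - mu_hat)

# Unadjusted Langevin sampler driven by the estimated score
# (a stand-in for the paper's diffusion-based sampler).
step, n_steps, m = 0.05, 500, 5_000
x = rng.standard_normal((m, d))  # initialise from N(0, I_d)
for _ in range(n_steps):
    x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal((m, d))

# The empirical mean of the samples should be close to mu_true.
print(x.mean(axis=0))
```

In this Gaussian setting the score-estimation error is controlled by |mu_hat - mu_true|, which is exactly the kind of explicit optimization estimate the abstract refers to; the paper's contribution is to combine such estimates with sampling error bounds into a full Wasserstein-2 convergence guarantee.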
Publisher
OpenReview.net
ISSN
2835-8856


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.