
Full metadata record

DC Field Value Language
dc.citation.conferencePlace SI -
dc.citation.title International Conference on Learning Representations -
dc.contributor.author Choi, Jinyoung -
dc.contributor.author Kang, Junoh -
dc.contributor.author Han, Bohyung -
dc.date.accessioned 2026-03-27T14:02:41Z -
dc.date.available 2026-03-27T14:02:41Z -
dc.date.created 2026-03-26 -
dc.date.issued 2025-04-24 -
dc.description.abstract Diffusion probabilistic models (DPMs), while effective in generating high-quality samples, often suffer from high computational costs due to their iterative sampling process. To address this, we propose an enhanced ODE-based sampling method for DPMs inspired by Richardson extrapolation, which reduces numerical error and improves convergence rates. Our method, RX-DPM, leverages multiple ODE solutions at intermediate time steps to extrapolate the denoised prediction in DPMs. This significantly enhances the accuracy of estimations for the final sample while maintaining the number of function evaluations (NFEs). Unlike standard Richardson extrapolation, which assumes uniform discretization of the time grid, we develop a more general formulation tailored to arbitrary time step scheduling, guided by local truncation error derived from a baseline sampling method. The simplicity of our approach facilitates accurate estimation of numerical solutions without significant computational overhead, and allows for seamless and convenient integration into various DPMs and solvers. Additionally, RX-DPM provides explicit error estimates, effectively demonstrating the faster convergence as the leading error term’s order increases. Through a series of experiments, we show that the proposed method improves the quality of generated samples without requiring additional sampling iterations. -
dc.identifier.bibliographicCitation International Conference on Learning Representations -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/91117 -
dc.identifier.url https://iclr.cc/virtual/2025/poster/28214 -
dc.language English -
dc.publisher ICLR -
dc.title Enhanced Diffusion Sampling via Extrapolation with Multiple ODE Solutions -
dc.type Conference Paper -
dc.date.conferenceDate 2025-04-24 -
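As background for the abstract: classical Richardson extrapolation combines two numerical solutions computed at different step sizes so that their leading error terms cancel, raising the order of accuracy without extra analytic work. The sketch below illustrates this on a scalar ODE with the explicit Euler method on a uniform grid; it is a generic textbook illustration, not the paper's RX-DPM solver or its generalization to arbitrary time-step schedules.

```python
import math

def euler_solve(f, y0, t0, t1, n_steps):
    """Explicit Euler: integrate y' = f(t, y) with n_steps uniform steps."""
    h = (t1 - t0) / n_steps
    y, t = y0, t0
    for _ in range(n_steps):
        y += h * f(t, y)
        t += h
    return y

def richardson_euler(f, y0, t0, t1, n_steps):
    """Richardson extrapolation over Euler (order 1): combine a coarse
    solution (step h) with a fine one (step h/2) as 2*fine - coarse,
    cancelling the leading O(h) error term."""
    coarse = euler_solve(f, y0, t0, t1, n_steps)
    fine = euler_solve(f, y0, t0, t1, 2 * n_steps)
    return 2.0 * fine - coarse

# Test problem: y' = -y, y(0) = 1, so the exact value is y(1) = e^{-1}.
f = lambda t, y: -y
exact = math.exp(-1.0)
plain = euler_solve(f, 1.0, 0.0, 1.0, 20)       # error ~ 1e-2
extrap = richardson_euler(f, 1.0, 0.0, 1.0, 20)  # error ~ 1e-4
```

With the same modest number of steps, the extrapolated estimate is roughly two orders of magnitude more accurate than plain Euler, which mirrors the abstract's claim that extrapolating across multiple ODE solutions improves accuracy at a fixed sampling budget.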


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.