Related Researcher

Lee, Seulki (이슬기)
Embedded Artificial Intelligence Lab.

Detailed Information

Bayesian Code Diffusion for Efficient Automatic Deep Learning Program Optimization

Author(s)
Jeong, Isu; Lee, Seulki
Issued Date
2025-07-08
URI
https://scholarworks.unist.ac.kr/handle/201301/87445
Fulltext
https://www.usenix.org/conference/osdi25/presentation/jeong
Citation
USENIX Symposium on Operating Systems Design and Implementation, pp.295 - 311
Abstract
We introduce Bayesian code diffusion, a new deep learning program optimization strategy devised to accelerate the auto-tuning process of deep learning compilers. By using the concepts of prior and posterior distributions in the Bayesian framework and reformulating them in the context of deep learning program optimization, the proposed approach efficiently searches for optimal program code in a significantly reduced search space through an iterative diffusion of program code. To further enhance the efficiency of program optimization, we propose pre-training and fine-tuning of the cost model, which improves both the model's predictive accuracy and training efficiency. We implement Bayesian code diffusion in Ansor and evaluate its performance on a wide range of deep learning models on both CPUs and GPUs. Existing approaches struggle to reliably generate high-performing deep learning programs, i.e., achieving low program execution latency, across various configurations, including diverse deep learning model architectures and hardware platforms (CPU and GPU). In contrast, Bayesian code diffusion reduces the end-to-end compilation (optimization) time required to generate the equivalent program execution latency on various setups, e.g., achieving up to 3.31x optimization speedup. This substantial improvement demonstrates that Bayesian code diffusion performs efficient and principled deep learning program optimization across a wide range of deep learning models, operators, and hardware (CPU and GPU).
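The abstract's core idea, refining a prior over program configurations into a posterior and searching within a progressively narrowed space, can be illustrated with a toy sketch. This is not the paper's algorithm: the cost surface, the single tunable parameter (a hypothetical tile size), and the interval-shrinking update are all illustrative assumptions standing in for the real program-code diffusion and learned cost model.

```python
import random


def toy_cost(tile_size):
    # Hypothetical latency surface: lowest near tile_size = 32,
    # with small measurement noise (stand-in for a real benchmark).
    return (tile_size - 32) ** 2 + random.uniform(0, 4)


def bayesian_style_search(lo=1, hi=128, rounds=4, samples=16, seed=0):
    """Iteratively shrink a search interval (the 'prior') around the best
    measured candidate (informing the 'posterior'), loosely mirroring how
    a reduced search space can be re-centered each optimization round."""
    random.seed(seed)
    best = lo
    for _ in range(rounds):
        # Sample candidate configurations from the current search space.
        candidates = [random.randint(lo, hi) for _ in range(samples)]
        best = min(candidates, key=toy_cost)
        # Narrow the search space around the best candidate found so far.
        width = max((hi - lo) // 4, 1)
        lo, hi = max(1, best - width), best + width
    return best
```

Under these assumptions the interval contracts each round, so later samples concentrate near promising configurations instead of being drawn uniformly over the full space, which is the efficiency argument the abstract makes for reduced-search-space tuning.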
Publisher
USENIX
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.