File Download

There are no files associated with this item.

Related Researcher

Lee, Yongjae (이용재)
Financial Engineering Lab.


Full metadata record

DC Field: Value
dc.citation.conferencePlace: SP
dc.citation.title: International Conference on Artificial Intelligence and Statistics
dc.contributor.author: Bae, Hyunglip
dc.contributor.author: Lee, Jinkyu
dc.contributor.author: Kim, Woo Chang
dc.contributor.author: Lee, Yongjae
dc.date.accessioned: 2024-01-31T19:06:39Z
dc.date.available: 2024-01-31T19:06:39Z
dc.date.created: 2023-07-18
dc.date.issued: 2023-04-26
dc.description.abstract: A neural network-based stagewise decomposition algorithm called Deep Value Function Networks (DVFN) is proposed for large-scale multistage stochastic programming (MSP) problems. Traditional approaches such as nested Benders decomposition and its stochastic variant, stochastic dual dynamic programming (SDDP), approximate value functions as piecewise linear convex functions by gradually accumulating subgradient cuts from the dual solutions of stagewise subproblems. Although these methods have proven effective for linear problems, nonlinear problems may suffer from a growing number of subgradient cuts as the algorithm proceeds. A recently developed algorithm called Value Function Gradient Learning (VFGL) replaced the piecewise linear approximation with a parametric function approximation, but its performance depends heavily on the choice of parametric form, as with most traditional parametric machine learning methods. DVFN, by contrast, approximates value functions using neural networks, which are known to have large representational capacity, so the art of choosing an appropriate parametric form reduces to a hyperparameter search. However, neural networks are non-convex in general, which can make the learning process unstable. We resolve this issue by using input convex neural networks, which guarantee convexity with respect to their inputs. We compare DVFN with SDDP and VFGL on large-scale linear and nonlinear MSP problems: production optimization and energy planning. Numerical examples clearly indicate that DVFN provides accurate and computationally efficient solutions.
dc.identifier.bibliographicCitation: International Conference on Artificial Intelligence and Statistics
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/74781
dc.identifier.url: https://proceedings.mlr.press/v206/bae23a.html
dc.publisher: Society for Artificial Intelligence and Statistics
dc.title: Deep Value Function Networks for Large-Scale Multistage Stochastic Programs
dc.type: Conference Paper
dc.date.conferenceDate: 2023-04-25
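The abstract's key device is the input convex neural network (ICNN): constraining the hidden-path weights to be non-negative and using a convex, non-decreasing activation makes the network output convex in its input, layer by layer. The following is a minimal NumPy sketch of that structural idea only — it is not the authors' DVFN implementation, and all layer sizes and function names here are invented for illustration:

```python
import numpy as np

def softplus(t):
    # softplus is convex and non-decreasing, which preserves convexity
    # when composed with a convex function of the input
    return np.logaddexp(0.0, t)

def icnn_forward(x, Wx, Wz, b):
    """Forward pass of a toy input convex neural network.

    Each layer computes softplus(|Wz| @ z + Wx @ x + b): a non-negative
    combination of convex functions of x plus an affine term in x, passed
    through a convex non-decreasing activation, so the output stays
    convex in x at every layer.
    """
    z = softplus(Wx[0] @ x + b[0])
    for k in range(1, len(Wx)):
        # np.abs enforces the non-negativity constraint on hidden weights
        z = softplus(np.abs(Wz[k - 1]) @ z + Wx[k] @ x + b[k])
    return z

# Sanity check of convexity along one segment: f((a+c)/2) <= (f(a)+f(c))/2.
rng = np.random.default_rng(0)
dims = [4, 8, 8, 1]  # input dim 4, two hidden layers, scalar output
Wx = [rng.standard_normal((dims[i + 1], dims[0])) for i in range(3)]
Wz = [rng.standard_normal((dims[i + 2], dims[i + 1])) for i in range(2)]
b = [rng.standard_normal(dims[i + 1]) for i in range(3)]
a, c = rng.standard_normal(4), rng.standard_normal(4)
mid = icnn_forward(0.5 * (a + c), Wx, Wz, b)
avg = 0.5 * (icnn_forward(a, Wx, Wz, b) + icnn_forward(c, Wx, Wz, b))
assert np.all(mid <= avg + 1e-9)
```

The midpoint inequality checked at the end is exactly the property that makes such an approximation usable inside a stagewise decomposition: a convex value-function surrogate keeps each stage subproblem a convex program.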


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.