Related Researcher
Lee, Yongjae (이용재), Financial Engineering Lab.

Deep Value Function Networks for Large-Scale Multistage Stochastic Programs

Author(s)
Bae, Hyunglip; Lee, Jinkyu; Kim, Woo Chang; Lee, Yongjae
Issued Date
2023-04-26
URI
https://scholarworks.unist.ac.kr/handle/201301/74781
Fulltext
https://proceedings.mlr.press/v206/bae23a.html
Citation
International Conference on Artificial Intelligence and Statistics
Abstract
A neural network-based stagewise decomposition algorithm called Deep Value Function Networks (DVFN) is proposed for large-scale multistage stochastic programming (MSP) problems. Traditional approaches such as nested Benders decomposition and its stochastic variant, stochastic dual dynamic programming (SDDP), approximate value functions as piecewise linear convex functions by gradually accumulating subgradient cuts from dual solutions of stagewise subproblems. Although these methods have proven effective for linear problems, nonlinear problems may suffer from the growing number of subgradient cuts as the algorithms proceed. A recently developed algorithm called Value Function Gradient Learning (VFGL) replaced the piecewise linear approximation with parametric function approximation, but its performance depends heavily on the choice of parametric form, as with most traditional parametric machine learning algorithms. In contrast, DVFN approximates value functions using neural networks, which are known to have enormous capacity in terms of their functional representations; the art of choosing an appropriate parametric form reduces to a routine hyperparameter search. However, neural networks are non-convex in general, which can make the learning process unstable. We resolve this issue by using input convex neural networks, which guarantee convexity with respect to their inputs. We compare DVFN with SDDP and VFGL for solving large-scale linear and nonlinear MSP problems: production optimization and energy planning. Numerical examples clearly indicate that DVFN provides accurate and computationally efficient solutions.
Publisher
Society for Artificial Intelligence and Statistics
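
The abstract's key architectural ingredient is the input convex neural network (ICNN), which keeps the learned value function convex in the stage input so that the stagewise subproblems remain convex. Below is a minimal illustrative sketch of such a network in PyTorch; it is not the authors' implementation, and the class name, layer sizes, and toy fitting loop are assumptions for exposition. It follows the standard ICNN construction: nonnegative hidden-to-hidden weights combined with a convex, nondecreasing activation.

```python
# Minimal sketch (assumption, not the DVFN authors' code): an input convex
# neural network (ICNN) of the kind used to keep a learned value function
# convex in its input.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ICNN(nn.Module):
    """z_{k+1} = softplus(Wz_k z_k + Wx_k x + b_k), with Wz_k >= 0.

    Nonnegative z-to-z weights plus a convex, nondecreasing activation make
    the scalar output convex with respect to the input x (Amos et al., 2017).
    """

    def __init__(self, in_dim: int, hidden: int = 64, depth: int = 2):
        super().__init__()
        # Direct input passthroughs (unconstrained, affine in x).
        self.x_layers = nn.ModuleList(
            [nn.Linear(in_dim, hidden)]
            + [nn.Linear(in_dim, hidden, bias=False) for _ in range(depth - 1)]
            + [nn.Linear(in_dim, 1, bias=False)]
        )
        # Hidden-to-hidden weights, reparameterized to be nonnegative in forward().
        self.z_layers = nn.ModuleList(
            [nn.Linear(hidden, hidden) for _ in range(depth - 1)]
            + [nn.Linear(hidden, 1)]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = F.softplus(self.x_layers[0](x))
        for x_lin, z_lin in zip(self.x_layers[1:], self.z_layers):
            w_pos = F.softplus(z_lin.weight)          # enforce Wz >= 0
            z = F.softplus(F.linear(z, w_pos, z_lin.bias) + x_lin(x))
        return z                                       # convex in x


if __name__ == "__main__":
    # Toy usage: fit the ICNN to samples of a convex target function,
    # standing in for a stagewise cost-to-go surface.
    net = ICNN(in_dim=3)
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    x = torch.randn(256, 3)
    y = (x ** 2).sum(dim=1, keepdim=True)              # convex toy target
    for _ in range(200):
        opt.zero_grad()
        loss = F.mse_loss(net(x), y)
        loss.backward()
        opt.step()
```

A design note on this sketch: only the hidden-to-hidden weights are constrained (here via a softplus reparameterization), while the direct x-to-hidden passthroughs remain unconstrained; this is what preserves convexity in x without unduly restricting the network's expressiveness.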

