Related Researcher

Cho, Kyung Hwa (조경화)
Water-Environmental Informatics Lab.

Detailed Information


Dual-stage attention-based LSTM for simulating performance of brackish water treatment plant

Author(s)
Yoon, Nakyung; Kim, Jihye; Lim, Jae-Lim; Abbas, Ather; Jeong, Kwanho; Cho, Kyung Hwa
Issued Date
2021-09
DOI
10.1016/j.desal.2021.115107
URI
https://scholarworks.unist.ac.kr/handle/201301/53123
Fulltext
https://www.sciencedirect.com/science/article/pii/S0011916421001788?via%3Dihub
Citation
DESALINATION, v.512, pp.115107
Abstract
The remarkable increase in freshwater demand in water-resource-stressed regions heightens the need for saltwater desalination and the deployment of brackish water treatment plants (BWTPs). In that respect, model-based process analysis can play an essential role in optimizing BWTP operation and maintenance (O&M) and reducing costs. In modeling, it is challenging for either theoretical or numerical methods to sufficiently account for the complex causality and the various correlations among the numerous process parameters and variables in a BWTP system. In contrast, deep learning approaches can model such a system because they describe the complexity and nonlinearity of its variables through robust autonomous learning. In this study, we modeled the reverse osmosis (RO) unit process of a BWTP using a conventional long short-term memory network (Conv-LSTM) and a dual-stage attention-based LSTM (DA-LSTM), based on hourly time-series data obtained from actual BWTP operation over a one-year period. Hyperparameters of the Conv-LSTM and DA-LSTM were optimized individually to enhance prediction performance. The prediction results demonstrated the superiority of DA-LSTM (R2 = 0.99) over Conv-LSTM (0.531 < R2 < 0.884). A sensitivity analysis offered straightforward interpretations of how the attention mechanisms of DA-LSTM used the time-series data of the model input and output parameters for prediction.
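The dual-stage attention idea described in the abstract (an input-attention stage that weights the driving input series, followed by a temporal-attention stage that weights the encoder's hidden states over the lookback window) can be sketched as a NumPy toy. This is an illustrative sketch only, not the authors' implementation: the weights are random stand-ins for trained parameters, a plain tanh RNN cell replaces the LSTM cell, and all variable names and shapes are assumptions.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
T, n, m = 10, 4, 8   # lookback window, no. of input series, hidden size

# Hypothetical hourly RO operating variables (random stand-ins for real data)
X = rng.normal(size=(T, n))

# Randomly initialised parameters (a trained model would learn these)
w_e = rng.normal(scale=0.1, size=m)       # scores the encoder state
u_e = rng.normal(scale=0.1, size=T)       # scores each series' full window
W_h = rng.normal(scale=0.1, size=(m, m))  # recurrent weights
W_x = rng.normal(scale=0.1, size=(m, n))  # input weights

# --- Stage 1: input attention over the n driving series ---
# (a plain tanh RNN cell stands in for the LSTM cell here)
h = np.zeros(m)
H = np.zeros((T, m))                      # encoder hidden states
for t in range(T):
    scores = np.array([h @ w_e + X[:, k] @ u_e for k in range(n)])
    alpha = softmax(scores)               # weights over the n input series
    x_tilde = alpha * X[t]                # re-weighted input at time t
    h = np.tanh(W_h @ h + W_x @ x_tilde)
    H[t] = h

# --- Stage 2: temporal attention over the T encoder states ---
w_d = rng.normal(scale=0.1, size=m)
beta = softmax(H @ w_d)                   # weights over the T time steps
context = beta @ H                        # attention-weighted summary, shape (m,)

w_out = rng.normal(scale=0.1, size=m)
y_hat = context @ w_out                   # one-step-ahead prediction (scalar)
```

The two softmax weight vectors (`alpha` over input series, `beta` over time steps) are what make the model interpretable: inspecting them is the kind of sensitivity analysis the abstract refers to.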
Publisher
ELSEVIER
ISSN
0011-9164
Keyword (Author)
Long short-term memory (LSTM); Dual-stage attention-based LSTM (DA-LSTM); Deep neural networks (DNN); Brackish water reverse osmosis (BWRO)
Keyword
SHORT-TERM-MEMORY; MATHEMATICAL-MODEL; NEURAL-NETWORKS; PREDICTION; MECHANISM; OPTIMIZATION


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.