Related Researcher

조경화

Cho, Kyung Hwa
Water-Environmental Informatics Lab.

Detailed Information

An open-source deep learning model for predicting effluent concentration in capacitive deionization

Author(s)
Son, Moon; Yoon, Nakyung; Park, Sanghun; Abbas, Ather; Cho, Kyung Hwa
Issued Date
2023-01
DOI
10.1016/j.scitotenv.2022.159158
URI
https://scholarworks.unist.ac.kr/handle/201301/60031
Citation
SCIENCE OF THE TOTAL ENVIRONMENT, v.856, no.2, pp.159158
Abstract
To effectively evaluate the performance of capacitive deionization (CDI), an electrochemical ion separation technology, it is necessary to accurately estimate the number of ions removed (effluent concentration) according to energy consumption. Herein, we propose and evaluate a deep learning model for predicting the effluent concentration of a CDI process. The developed deep learning model exhibited excellent prediction accuracy for both constant-current and constant-voltage modes (R² ≥ 0.968), and the accuracy increased with the data size. The model is based on the open-source language Python, and the code has been distributed with proper instructions for general use. Owing to the nature of the data-oriented deep learning model, the findings of this study are applicable not only to conventional CDI but also to various other types of CDI (membrane CDI, flow CDI, faradaic CDI, etc.). Therefore, by referring to the examples shown in this study, we hope that this open-source deep learning code will be widely used in CDI research.
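
The authors' code is distributed separately with the article; as a rough illustration of the kind of data-driven regressor the abstract describes, the sketch below trains a small feed-forward network in Python (Keras) to map CDI operating variables to effluent concentration. The input variables, layer sizes, training settings, and synthetic data are assumptions for illustration only, not the published architecture.

# Minimal sketch (not the authors' released code): a small neural network
# regressing effluent concentration from CDI operating data.
import numpy as np
from tensorflow import keras

# Hypothetical inputs per sample: applied voltage (V), current (A),
# flow rate (mL/min), influent concentration (mM), elapsed time (s).
rng = np.random.default_rng(0)
X = rng.random((1000, 5)).astype("float32")  # stand-in operating data
y = rng.random((1000, 1)).astype("float32")  # stand-in effluent concentration

model = keras.Sequential([
    keras.layers.Input(shape=(5,)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(1),                   # predicted effluent concentration
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=10, batch_size=32, verbose=0)

print(model.predict(X[:3]))                  # predictions for three samples

With real constant-current or constant-voltage experimental data in place of the random arrays, the same pattern (fit, then predict) reproduces the general workflow the abstract outlines; accuracy would be assessed with a held-out test set, e.g. via R².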
Publisher
ELSEVIER
ISSN
0048-9697
Keyword (Author)
Deep learning; Neural networks; Python; Capacitive deionization; Effluent conductivity
Keyword
ENERGY-CONSUMPTION; DESALINATION; CDI

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.