Related Researcher

Won, Jongmuk (원종묵)
Sustainable Smart Geotechnical Lab.

Detailed Information

Data-driven machine learning models for predicting engineering properties in deep-sea sediments

Author(s)
Yun, Jungmin; Park, Junghee; Choo, Hyunwook; Lee, Hyung-Min; Won, Jongmuk
Issued Date
2025-11
DOI
10.1038/s41598-025-29402-7
URI
https://scholarworks.unist.ac.kr/handle/201301/90291
Citation
SCIENTIFIC REPORTS, v.15, no.1, pp.44987
Abstract
Predicting the properties of deep-sea sediments offers critical insights into past oceanic conditions, including sediment composition, stratigraphy, and geochemical signals. However, accurate prediction is hindered by the high spatial variability of these sediments. This study presents a data-driven machine learning framework to predict five key sediment properties. Five prediction scenarios were developed with tailored preprocessing and hyperparameter tuning, and Shapley additive explanations were employed to assess feature importance and the relationships between depth and sediment properties. Among the five tested algorithms, the extreme gradient boosting (XGBoost) model achieved the highest predictive performance. Depth and compressional wave velocity emerged as the most and second most influential features for estimating porosity, grain density, calcite content, and thermal conductivity. The depth-dependent predictions with quantified uncertainties generated by the XGBoost model demonstrate that the proposed framework provides a robust approach for predicting deep-sea sediment properties.
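The workflow the abstract describes — train tree-based regressors on sediment features such as depth and compressional wave velocity, then rank feature importance — can be sketched as follows. This is a minimal illustration on synthetic data, not the authors' pipeline: it substitutes scikit-learn's `GradientBoostingRegressor` for XGBoost and permutation importance for Shapley additive explanations, and the feature names and the depth–porosity relationship are assumed for the demo only.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

# Synthetic stand-in for core-log data (hypothetical features and relationship).
rng = np.random.default_rng(0)
n = 500
depth = rng.uniform(0, 400, n)                    # depth below seafloor, m (assumed)
vp = 1500 + 0.8 * depth + rng.normal(0, 30, n)    # compressional wave velocity, m/s
noise_feat = rng.normal(size=n)                   # uninformative control feature
X = np.column_stack([depth, vp, noise_feat])
y = 80 - 0.08 * depth + rng.normal(0, 2, n)       # assumed porosity trend, %

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
r2 = model.score(X_te, y_te)

# Rank features on held-out data; a stand-in for the paper's SHAP analysis.
imp = permutation_importance(model, X_te, y_te, n_repeats=10, random_state=0)
names = ["depth", "vp", "noise"]
ranked = sorted(zip(names, imp.importances_mean), key=lambda t: -t[1])
print(f"R^2 = {r2:.2f}; importance ranking: {[name for name, _ in ranked]}")
```

In this toy setup, depth (and its correlate `vp`) should dominate the ranking while the noise feature scores near zero, mirroring the abstract's finding that depth and compressional wave velocity were the most influential predictors.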
Publisher
NATURE PORTFOLIO
ISSN
2045-2322
Keyword (Author)
Data-driven approach; Deep-sea sediment; Feature importance; Shapley additive explanations; Machine learning
Keyword
ARTIFICIAL NEURAL-NETWORKS; NORTH-ATLANTIC; EXPEDITION; POROSITY; FLOOR; LIFE


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.