File Download

There are no files associated with this item.

Related Researcher

Im, Jungho (임정호)
Intelligent Remote Sensing and Geospatial Information Science Lab.

Detailed Information

Enhancing tropical cyclone intensity forecasting with explainable deep learning integrating satellite observations and numerical model outputs

Author(s)
Lee, Juhyun; Im, Jungho; Shin, Yeji
Issued Date
2024-06
DOI
10.1016/j.isci.2024.109905
URI
https://scholarworks.unist.ac.kr/handle/201301/83650
Citation
ISCIENCE, v.27, no.6, art. no. 109905
Abstract
Tropical cyclone (TC) intensity change forecasting remains challenging due to limited understanding of the interactions between TC changes and environmental parameters, and to the high uncertainties resulting from climate change. This study proposed hybrid convolutional neural networks (hybrid-CNN), which effectively combined satellite-based spatial characteristics and numerical prediction model outputs, to forecast TC intensity with lead times of 24, 48, and 72 h. The models were validated against best track data by TC category and phase and compared with Korea Meteorological Administration (KMA)-based TC forecasts. The hybrid-CNN-based forecasts outperformed KMA-based forecasts, exhibiting up to 22%, 110%, and 7% improvement in skill scores for the 24-, 48-, and 72-h forecasts, respectively. For rapid intensification cases, the models exhibited improvements of 62%, 87%, and 50% over KMA-based forecasts for the three lead times. Moreover, explainable deep learning demonstrated the hybrid-CNN's potential for predicting TC intensity and its contribution to the TC forecasting field.
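To make the two-branch design the abstract describes concrete, the sketch below shows one common way to fuse a satellite-image branch with a numerical-model-output branch in a single regression network, together with one common skill-score convention. All layer sizes, input shapes, feature counts, and names are illustrative assumptions, not the authors' published architecture.

```python
# Minimal sketch of a two-branch ("hybrid") CNN in the spirit of the abstract.
# Shapes and layer choices are assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, Model

# Branch 1: satellite imagery (e.g., a brightness-temperature patch centered
# on the storm; the 64x64 single-channel size is an assumption).
img_in = layers.Input(shape=(64, 64, 1), name="satellite_patch")
x = layers.Conv2D(32, 3, activation="relu")(img_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# Branch 2: numerical prediction model outputs (e.g., environmental
# predictors such as vertical wind shear; 8 features is an assumption).
num_in = layers.Input(shape=(8,), name="nwp_features")
y = layers.Dense(32, activation="relu")(num_in)

# Fuse both branches and regress TC intensity at one lead time (e.g., 24 h).
z = layers.concatenate([x, y])
z = layers.Dense(64, activation="relu")(z)
out = layers.Dense(1, name="intensity_24h")(z)

model = Model(inputs=[img_in, num_in], outputs=out)
model.compile(optimizer="adam", loss="mae")

def skill_score(err_model: float, err_reference: float) -> float:
    """Percentage improvement of a model over a reference forecast,
    using the common convention SS = (E_ref - E_model) / E_ref * 100.
    (Assumed here; the paper's exact skill-score definition may differ.)"""
    return (err_reference - err_model) / err_reference * 100.0
```

In this layout, separate lead times (24, 48, and 72 h) would typically be handled by training separate models or separate output heads; the single 24-h head above is just the simplest case.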
Publisher
CELL PRESS
ISSN
2589-0042
Keyword
RAPID INTENSIFICATION; VERTICAL SHEAR; PART I; PREDICTION; IMPACT; SCHEME; TRACK; INITIALIZATION; TYPHOON ACTIVITY; CONVOLUTIONAL NEURAL-NETWORKS

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.