Related Researcher

Im, Jungho (임정호)
Intelligent Remote Sensing and Geospatial Information Science Lab.


Full metadata record

DC Field Value Language
dc.citation.endPage 1029 -
dc.citation.number 5-3 -
dc.citation.startPage 1009 -
dc.citation.title KOREAN JOURNAL OF REMOTE SENSING -
dc.citation.volume 39 -
dc.contributor.author Bae, Sejeong -
dc.contributor.author Son, Bokyung -
dc.contributor.author Sung, Taejun -
dc.contributor.author Lee, Yeonsu -
dc.contributor.author Im, Jungho -
dc.contributor.author Kang, Yoojin -
dc.date.accessioned 2024-01-03T11:35:10Z -
dc.date.available 2024-01-03T11:35:10Z -
dc.date.created 2024-01-02 -
dc.date.issued 2023-10 -
dc.description.abstract Urban trees play a vital role in urban ecosystems, significantly reducing impervious surfaces and impacting carbon cycling within the city. Although previous research has demonstrated the efficacy of employing artificial intelligence in conjunction with airborne light detection and ranging (LiDAR) data to generate urban tree information, the availability and cost constraints associated with LiDAR data pose limitations. Consequently, this study employed freely accessible, high-resolution multispectral satellite imagery (i.e., Sentinel-2 data) to estimate fractional tree canopy cover (FTC) within the urban confines of Suwon, South Korea, employing machine learning techniques. This study leveraged a median composite image derived from a time series of Sentinel-2 images. In order to account for the diverse land cover found in urban areas, the model incorporated three types of input variables: the average (mean) and standard deviation (std) values, within a 30 m grid, of 10 m resolution optical indices from Sentinel-2, and the fractional coverage of distinct land cover classes within the 30 m grids from an existing level-3 land cover map. Four schemes with different combinations of input variables were compared. Notably, when all three factors (i.e., mean, std, and fractional cover) were used to account for the variation of land cover in urban areas (Scheme 4, S4), the machine learning model exhibited improved performance compared to using only the mean of the optical indices (Scheme 1). Of the various models proposed, the random forest (RF) model with S4 demonstrated the most remarkable performance, achieving an R2 of 0.8196, a mean absolute error (MAE) of 0.0749, and a root mean squared error (RMSE) of 0.1022. Based on the variable importance analysis, the std variables exhibited the highest impact on model outputs within heterogeneous land covers. This trained RF model with S4 was then applied to the entire Suwon region, consistently delivering robust results with an R2 of 0.8702, MAE of 0.0873, and RMSE of 0.1335. The FTC estimation method developed in this study is expected to offer advantages for application in various regions, providing fundamental data for a better understanding of carbon dynamics in urban ecosystems in the future. -
dc.identifier.bibliographicCitation KOREAN JOURNAL OF REMOTE SENSING, v.39, no.5-3, pp.1009 - 1029 -
dc.identifier.doi 10.7780/kjrs.2023.39.5.3.10 -
dc.identifier.issn 1225-6161 -
dc.identifier.scopusid 2-s2.0-85177602177 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/66463 -
dc.language English -
dc.publisher KOREAN SOC REMOTE SENSING -
dc.title Estimation of Fractional Urban Tree Canopy Cover through Machine Learning Using Optical Satellite Images -
dc.type Article -
dc.description.isOpenAccess TRUE -
dc.type.docType Article -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Artificial intelligence -
dc.subject.keywordAuthor Remote sensing -
dc.subject.keywordAuthor Sentinel-2 -
dc.subject.keywordAuthor Tree area -
dc.subject.keywordAuthor Vegetation -
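
As an illustration of the method summarized in the abstract above, below is a minimal sketch of the Scheme 4 (S4) setup: per 30 m grid cell, the mean and standard deviation of Sentinel-2 optical indices plus fractional land cover, fed to a random forest regressor. It is written against scikit-learn with purely synthetic data; the feature counts, hyperparameters, and variable names are illustrative assumptions, not the authors' published configuration.

    # Hedged sketch of FTC estimation with Scheme 4 (S4) inputs.
    # Synthetic data only: feature counts and hyperparameters are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_cells = 5000    # number of 30 m grid cells (synthetic)
    n_indices = 5     # optical indices from Sentinel-2 (e.g., NDVI); assumed count
    n_classes = 7     # level-3 land cover classes; assumed count

    # S4 inputs per cell: mean and std of each index over the 10 m pixels
    # in the cell, plus fractional coverage of each land cover class.
    index_mean = rng.uniform(-1.0, 1.0, size=(n_cells, n_indices))
    index_std = rng.uniform(0.0, 0.3, size=(n_cells, n_indices))
    lc_fraction = rng.dirichlet(np.ones(n_classes), size=n_cells)
    X = np.hstack([index_mean, index_std, lc_fraction])

    # Synthetic target: fractional tree canopy cover (FTC) in [0, 1].
    y = np.clip(0.5 * index_mean[:, 0] + 0.5
                + 0.1 * rng.standard_normal(n_cells), 0.0, 1.0)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42)

    rf = RandomForestRegressor(n_estimators=500, random_state=42)
    rf.fit(X_train, y_train)
    pred = rf.predict(X_test)

    # Same metrics reported in the abstract (meaningless on synthetic data).
    print(f"R2:   {r2_score(y_test, pred):.4f}")
    print(f"MAE:  {mean_absolute_error(y_test, pred):.4f}")
    print(f"RMSE: {mean_squared_error(y_test, pred) ** 0.5:.4f}")

    # Variable importance, analogous to the paper's analysis of the std
    # features' impact in heterogeneous land covers.
    top = np.argsort(rf.feature_importances_)[::-1][:5]
    print("Top-5 feature indices by importance:", top)

The printed R2, MAE, and RMSE parallel the metrics quoted in the abstract, and the importance ranking mirrors the paper's variable importance analysis; on synthetic data the numbers carry no scientific meaning.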


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.