Related Researcher

Im, Jungho (임정호)
Intelligent Remote sensing and geospatial Information Science Lab.

Detailed Information

Merging multiple sensing platforms and deep learning empowers individual tree mapping and species detection at the city scale

Author(s)
Kwon, Ryoungseob; Ryu, Youngryel; Yang, Tackang; Zhong, Zilong; Im, Jungho
Issued Date
2023-12
DOI
10.1016/j.isprsjprs.2023.11.011
URI
https://scholarworks.unist.ac.kr/handle/201301/67559
Citation
ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, v.206, pp.201-221
Abstract
The precise estimation of the number of trees, their individual locations, and species information is crucial for enhancing ecosystem services in urban areas. Previous studies largely used satellite or airborne images for mapping trees, but these were insufficient to generate distributions for a large number of species. Ground-level data, although providing reliable species detection results, led to inaccurate positional information for individual trees. In this study, we propose a novel framework that fully explores the complementary strengths of air- and ground-level sensing platforms by leveraging various deep neural networks to generate detailed tree maps at a city-wide scale. Our strategy includes individual tree mapping and tree species detection using three-color-channel images acquired from multiple sensing platforms. Through publicly available airborne imagery, we estimate the presence of over 1.2 million trees in Suwon city of South Korea spanning 121.04 km² (R² of 0.95 and relative bias of -1.9 %), achieving more accurate individual tree positions (positional uncertainty around 2.0 m) than conventional methods. Our comprehensive experiments also demonstrate the effectiveness of utilizing tree bark photos and street-level imagery taken by citizens and vehicles to identify urban tree species, with accuracy rates of over 80 % for citizen-sensed tree species maps and 66 % for vehicle-sensed tree species maps. Along with the proliferation of web-based airborne images, the widespread use of smartphones, and the advancements in vehicle-mounted sensors, this study can facilitate efficient and accurate management of urban trees across scales. For reproducibility of the study, we share the source code and datasets at https://github.com/landkwon94/ryoungseob-master.
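
The abstract describes a two-stage workflow: detecting individual trees in airborne RGB imagery, then identifying species from ground-level (bark and street-view) photos. The snippet below is a minimal illustrative sketch of that idea, not the authors' released implementation (see the GitHub link above). It assumes PyTorch with torchvision 0.13 or later; the choice of Faster R-CNN and ResNet-50, the two-class detection head, and the five-species output layer are placeholders for demonstration only.

import torch
from torchvision.models import resnet50
from torchvision.models.detection import fasterrcnn_resnet50_fpn

NUM_SPECIES = 5  # hypothetical number of target species

# Stage 1: individual tree detection on an airborne RGB tile.
# A real system would fine-tune this detector on tree-crown annotations.
detector = fasterrcnn_resnet50_fpn(weights=None, num_classes=2)  # background + "tree"
detector.eval()

# Stage 2: species classification from a ground-level bark / street-view photo.
classifier = resnet50(weights=None)
classifier.fc = torch.nn.Linear(classifier.fc.in_features, NUM_SPECIES)
classifier.eval()

# Random tensors stand in for a three-channel airborne tile and a bark photo.
airborne_tile = torch.rand(3, 512, 512)
bark_photo = torch.rand(1, 3, 224, 224)

with torch.no_grad():
    detections = detector([airborne_tile])[0]   # dict with 'boxes', 'labels', 'scores'
    species_logits = classifier(bark_photo)
    species_id = species_logits.argmax(dim=1).item()

print(f"candidate trees detected: {len(detections['boxes'])}, "
      f"predicted species id: {species_id}")

In the study itself, such models would be trained on the airborne tiles and citizen- or vehicle-collected photos described in the abstract, and the detections georeferenced to produce the city-scale species map.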
Publisher
ELSEVIER
ISSN
0924-2716
Keyword (Author)
Urban trees mapping; Tree species detection; Multi-modal deep learning; Computer vision
Keyword
GOOGLE STREET VIEW; CITIZEN SCIENCE; CROWN DELINEATION; CLIMATE-CHANGE; CLASSIFICATION; LEVEL; AIRBORNE IMAGERY; INFORMATION; FORESTS

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.