File Download

There are no files associated with this item.

Related Researcher

Oh, Hyondong (오현동)
Autonomous Systems Lab.


Full metadata record

DC Field Value Language
dc.citation.endPage 175 -
dc.citation.startPage 164 -
dc.citation.title INTERNATIONAL JOURNAL OF AERONAUTICAL AND SPACE SCIENCES -
dc.citation.volume 25 -
dc.contributor.author Bae, Seonguk -
dc.contributor.author Shin, Heejung -
dc.contributor.author Kim, Hyeongseop -
dc.contributor.author Park, Minkyu -
dc.contributor.author Choi, Myong-Yol -
dc.contributor.author Oh, Hyondong -
dc.date.accessioned 2023-12-21T12:36:47Z -
dc.date.available 2023-12-21T12:36:47Z -
dc.date.created 2023-07-24 -
dc.date.issued 2024-01 -
dc.description.abstract This paper proposes a deep neural network-based object detection algorithm that uses RGB and infrared (IR) images for human detection with drones. Although some public aerial-view RGB image datasets exist, there is no publicly available dataset in which RGB and IR images are captured simultaneously from drones. Thus, we collect RGB and IR images at various altitudes on our own. However, the detection performance of the RGB- and IR-based algorithm is limited, because far more RGB images are available for training than IR images or simultaneously captured RGB-IR image pairs. To address this data imbalance, we use a generative adversarial network model, CycleGAN, to generate IR images from RGB images in the public and self-collected datasets. Furthermore, the object detection network is accelerated through neural network quantization and optimization so that the algorithm runs faster on an embedded computing board. To shorten the search time and improve detection robustness, we employ a formation flight of multiple drones for human detection. The effectiveness of the integrated system of formation flight and the onboard human detection algorithm is validated by real flight experiments that locate humans in a wide outdoor area. -
dc.identifier.bibliographicCitation INTERNATIONAL JOURNAL OF AERONAUTICAL AND SPACE SCIENCES, v.25, pp.164 - 175 -
dc.identifier.doi 10.1007/s42405-023-00632-1 -
dc.identifier.issn 2093-274X -
dc.identifier.scopusid 2-s2.0-85163005337 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/64990 -
dc.identifier.wosid 001017551500001 -
dc.language English -
dc.publisher SPRINGER -
dc.title Deep Learning-Based Human Detection Using RGB and IR Images from Drones -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Engineering, Aerospace -
dc.relation.journalResearchArea Engineering -
dc.type.docType Article; Early Access -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.description.journalRegisteredClass kci -
dc.subject.keywordAuthor CycleGAN -
dc.subject.keywordAuthor Formation flight -
dc.subject.keywordAuthor Infrared image -
dc.subject.keywordAuthor Neural network acceleration -
dc.subject.keywordAuthor Neural network quantization -
dc.subject.keywordAuthor Human detection -
dc.subject.keywordPlus SEARCH -
dc.subject.keywordPlus RESCUE -


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.