File Download

  • Find it @ UNIST can give you direct access to the published full text of this article. (UNISTARs only)
Related Researcher

오현동

Oh, Hyondong
Autonomous Systems Lab.

Detailed Information


Full metadata record

DC Field Value Language
dc.citation.number 11 -
dc.citation.startPage 2523 -
dc.citation.title SENSORS -
dc.citation.volume 19 -
dc.contributor.author Cho, Gangik -
dc.contributor.author Kim, Jongyun -
dc.contributor.author Oh, Hyondong -
dc.date.accessioned 2023-12-21T19:07:47Z -
dc.date.available 2023-12-21T19:07:47Z -
dc.date.created 2019-06-05 -
dc.date.issued 2019-06 -
dc.description.abstract Due to payload restrictions on micro aerial vehicles (MAVs), vision-based approaches have been widely studied owing to their light weight and cost effectiveness. In particular, optical flow-based obstacle avoidance has proven to be one of the most efficient methods in terms of both avoidance capability and computational load; however, existing approaches do not consider complex 3-D environments. In addition, most approaches cannot handle situations with wall-like frontal obstacles, and the algorithms that do consider such obstacles tend to cause jitter or unnecessary motion. To address these limitations, this paper proposes a vision-based obstacle avoidance algorithm for MAVs that uses optical flow in 3-D textured environments. The image obtained from a monocular camera is first split into left/right and upper/lower half planes. The desired heading direction and climb rate are then determined by comparing the sums of optical flow between the left and right half planes and between the upper and lower half planes, respectively, enabling obstacle avoidance in 3-D environments. Furthermore, the proposed approach avoids wall-like frontal obstacles by considering the divergence of the optical flow at the focus of expansion, and it navigates to the goal position using a sigmoid weighting function. The performance of the proposed algorithm was validated through numerical simulations and indoor flight experiments in various situations. -
dc.identifier.bibliographicCitation SENSORS, v.19, no.11, pp.2523 -
dc.identifier.doi 10.3390/s19112523 -
dc.identifier.issn 1424-8220 -
dc.identifier.scopusid 2-s2.0-85067188145 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/26829 -
dc.identifier.url https://www.mdpi.com/1424-8220/19/11/2523 -
dc.identifier.wosid 000472133300101 -
dc.language English -
dc.publisher Multidisciplinary Digital Publishing Institute (MDPI) -
dc.title Vision-Based Obstacle Avoidance Strategies for MAVs Using Optical Flows in 3-D Textured Environments -
dc.type Article -
dc.description.isOpenAccess TRUE -
dc.relation.journalWebOfScienceCategory Chemistry, Analytical; Engineering, Electrical & Electronic; Instruments & Instrumentation -
dc.relation.journalResearchArea Chemistry; Engineering; Instruments & Instrumentation -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor vision-based obstacle avoidance -
dc.subject.keywordAuthor optical flow -
dc.subject.keywordAuthor Horn-Schunck method -
dc.subject.keywordAuthor focus of expansion -
dc.subject.keywordAuthor micro aerial vehicle -
dc.subject.keywordPlus NAVIGATION -
dc.subject.keywordPlus FLIGHT -
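
The abstract above describes a decision rule that can be summarized compactly: sum the optical-flow magnitude over image half planes and steer toward the half plane with less flow, blending in goal navigation with a sigmoid weight. The Python sketch below is a minimal, hedged illustration of that idea, not the authors' implementation: OpenCV's Farnebäck dense flow stands in for the Horn-Schunck method listed in the author keywords, and the gain and sigmoid parameters (k_yaw, k_climb, steepness, offset) are hypothetical placeholders.

```python
# Illustrative sketch only (not the authors' code): OpenCV's Farneback dense
# flow is used as a stand-in for the Horn-Schunck method named in the author
# keywords, and all gains/offsets below are made-up placeholders.
import cv2
import numpy as np


def half_plane_commands(prev_gray, curr_gray, k_yaw=1.0, k_climb=1.0):
    """Return (yaw_cmd, climb_cmd) from two consecutive grayscale frames by
    comparing summed flow magnitudes between image half planes."""
    # Dense optical flow (H x W x 2); positional args: pyr_scale, levels,
    # winsize, iterations, poly_n, poly_sigma, flags.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)      # per-pixel flow magnitude
    h, w = mag.shape

    left, right = mag[:, :w // 2].sum(), mag[:, w // 2:].sum()
    top, bottom = mag[:h // 2, :].sum(), mag[h // 2:, :].sum()

    # Larger summed flow suggests closer texture on that side, so steer away:
    # positive yaw_cmd (more flow on the left) -> yaw right;
    # positive climb_cmd (more flow below) -> climb.
    yaw_cmd = k_yaw * (left - right) / (left + right + 1e-9)
    climb_cmd = k_climb * (bottom - top) / (bottom + top + 1e-9)
    return yaw_cmd, climb_cmd


def sigmoid_weight(x, steepness=1.0, offset=0.0):
    """Generic sigmoid in (0, 1) for blending goal seeking with avoidance;
    its argument and parameters here are assumptions, not paper values."""
    return 1.0 / (1.0 + np.exp(-steepness * (x - offset)))
```

A frontal, wall-like obstacle would additionally be detected from the flow divergence around the focus of expansion (large, roughly symmetric expansion on both sides of the image), which the paper handles explicitly; that step is omitted from this sketch.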


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.