Related Researcher

배준범

Bae, Joonbum
Bio-robotics and Control Lab.


Full metadata record

DC Field Value Language
dc.citation.title IEEE ROBOTICS & AUTOMATION MAGAZINE -
dc.contributor.author Park, Sungman -
dc.contributor.author Kim, Junsoo -
dc.contributor.author Lee, Hojae -
dc.contributor.author Jo, Minwoong -
dc.contributor.author Gong, Dohoon -
dc.contributor.author Ju, Dawon -
dc.contributor.author Won, Dami -
dc.contributor.author Kim, Sihyeon -
dc.contributor.author Oh, Jinhyeok -
dc.contributor.author Jang, Hun -
dc.contributor.author Bae, Joonbum -
dc.date.accessioned 2024-01-12T02:05:08Z -
dc.date.available 2024-01-12T02:05:08Z -
dc.date.created 2023-08-30 -
dc.date.issued 2023-08 -
dc.description.abstract This article proposes an intuitive and immersive whole-body teleoperation system with motion-based control and multimodal feedback. The system consists of an anthropomorphic teleoperated robot and a haptic interface platform. The teleoperated robot has dual arms with dexterous hands, a head with a neck, and a waist, giving it a human-like appearance and a wide range of motion, as well as an omnidirectional mobile platform for improved mobility. The haptic interface platform enables a human operator to control the robot intuitively by measuring the operator's motion with a motion-capture system, providing haptic feedback to the user's arms, fingers, and feet, and providing 3D image feedback. Additionally, facial animation further enhances immersion by synchronizing the facial expression of the robot with the user's voice. The proposed teleoperation system offers a promising solution for a human-oriented robotic avatar system, which was verified through a global competition: the $10 M All Nippon Airways (ANA) Avatar XPRIZE. The system was successfully evaluated with 45 min of training time for users who were new to it. Finally, the lessons learned from the competition and future improvements are discussed. -
dc.identifier.bibliographicCitation IEEE ROBOTICS & AUTOMATION MAGAZINE -
dc.identifier.doi 10.1109/MRA.2023.3328512 -
dc.identifier.issn 1070-9932 -
dc.identifier.scopusid 2-s2.0-85178006318 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/68034 -
dc.identifier.wosid 001109203800001 -
dc.language English -
dc.publisher Institute of Electrical and Electronics Engineers -
dc.title A Whole-Body Integrated AVATAR System: Implementation of Telepresence with Intuitive Control and Immersive Feedback -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Automation & Control Systems;Robotics -
dc.relation.journalResearchArea Automation & Control Systems;Robotics -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Robots -
dc.subject.keywordAuthor Robot sensing systems -
dc.subject.keywordAuthor Manipulators -
dc.subject.keywordAuthor Legged locomotion -
dc.subject.keywordAuthor Haptic interfaces -
dc.subject.keywordAuthor Sensors -
dc.subject.keywordAuthor Kinematics -
dc.subject.keywordPlus LOCO-MANIPULATION -
dc.subject.keywordPlus TELEOPERATION -
dc.subject.keywordPlus INTERFACE -
dc.subject.keywordPlus ROBOT -
dc.subject.keywordPlus ARM -


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.