Full metadata record
DC Field | Value | Language |
---|---|---|
dc.citation.endPage | 219933 | - |
dc.citation.startPage | 219923 | - |
dc.citation.title | IEEE ACCESS | - |
dc.citation.volume | 8 | - |
dc.contributor.author | Lee, Minhyuk | - |
dc.contributor.author | Bae, Joonbum | - |
dc.date.accessioned | 2023-12-21T16:44:03Z | - |
dc.date.available | 2023-12-21T16:44:03Z | - |
dc.date.created | 2020-11-17 | - |
dc.date.issued | 2020-11 | - |
dc.description.abstract | In this article, a real-time dynamic finger gesture recognition system using a soft-sensor-embedded data glove is presented, which measures the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles of the five fingers. In the gesture recognition field, a challenging problem is that of separating meaningful dynamic gestures from a continuous data stream. Unconscious hand motions or sudden tremors, which can easily lead to segmentation ambiguity, make this problem difficult. Furthermore, users differ in hand shape and speed when performing the same dynamic gesture, and even gestures made by a single user often vary. To separate meaningful dynamic gestures, we propose a deep learning-based gesture spotting algorithm that detects the start and end of a gesture sequence in a continuous data stream. The gesture spotting algorithm takes window data and estimates a scalar value named the gesture progress sequence (GPS), a quantity that represents how far the gesture has progressed. To address gesture variation, we further propose a sequence simplification algorithm and a deep learning-based gesture recognition algorithm. The three proposed algorithms (gesture spotting, sequence simplification, and gesture recognition) are unified into a real-time gesture recognition system, which was tested with 11 dynamic finger gestures in real time. The system took only 6 ms to estimate a GPS value and no more than 12 ms to recognize a completed gesture in real time. | - |
dc.identifier.bibliographicCitation | IEEE ACCESS, v.8, pp.219923 - 219933 | - |
dc.identifier.doi | 10.1109/ACCESS.2020.3039401 | - |
dc.identifier.issn | 2169-3536 | - |
dc.identifier.scopusid | 2-s2.0-85096832147 | - |
dc.identifier.uri | https://scholarworks.unist.ac.kr/handle/201301/48766 | - |
dc.identifier.url | https://ieeexplore.ieee.org/document/9264164 | - |
dc.identifier.wosid | 000600316900001 | - |
dc.language | English | - |
dc.publisher | IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC | - |
dc.title | Deep Learning based Real-time Recognition of Dynamic Finger Gestures using a Data Glove | - |
dc.type | Article | - |
dc.description.isOpenAccess | TRUE | - |
dc.relation.journalWebOfScienceCategory | Computer Science, Information Systems; Engineering, Electrical & Electronic; Telecommunications | - |
dc.relation.journalResearchArea | Computer Science; Engineering; Telecommunications | - |
dc.type.docType | Article | - |
dc.description.journalRegisteredClass | scie | - |
dc.description.journalRegisteredClass | scopus | - |
dc.subject.keywordAuthor | Gesture recognition | - |
dc.subject.keywordAuthor | Heuristic algorithms | - |
dc.subject.keywordAuthor | Data gloves | - |
dc.subject.keywordAuthor | Dynamics | - |
dc.subject.keywordAuthor | Real-time systems | - |
dc.subject.keywordAuthor | Global Positioning System | - |
dc.subject.keywordAuthor | Feature extraction | - |
dc.subject.keywordAuthor | Artificial neural network | - |
dc.subject.keywordAuthor | data glove | - |
dc.subject.keywordAuthor | data compression | - |
dc.subject.keywordAuthor | dynamic gesture recognition | - |
dc.subject.keywordAuthor | human-computer interaction | - |
dc.subject.keywordAuthor | pattern recognition | - |
dc.subject.keywordAuthor | real time system | - |
dc.subject.keywordAuthor | recurrent neural network | - |
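The abstract above describes a three-stage pipeline: a spotting network that maps a sliding window of glove joint angles to a scalar gesture progress sequence (GPS) value, a sequence simplification step, and a recognition network that classifies the completed gesture. The following is a minimal sketch of that flow, assuming a PyTorch implementation with GRU-based networks; the window length, thresholds, network sizes, and the distance-based simplification rule are illustrative assumptions, not the authors' published design.

```python
# Hypothetical sketch of the spotting -> simplification -> recognition pipeline.
# Hyperparameters and the simplification rule are assumptions for illustration.
import torch
import torch.nn as nn

N_JOINTS = 10       # MCP + PIP joint angles of five fingers (from the abstract)
WINDOW = 20         # sliding-window length (assumed)
N_GESTURES = 11     # 11 dynamic finger gestures (from the abstract)


class SpottingNet(nn.Module):
    """Maps a window of joint angles to a scalar GPS value in [0, 1]."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(N_JOINTS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, window):                    # window: (batch, WINDOW, N_JOINTS)
        _, h = self.rnn(window)
        return torch.sigmoid(self.head(h[-1])).squeeze(-1)


class RecognitionNet(nn.Module):
    """Classifies a simplified gesture sequence into one of the 11 gestures."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(N_JOINTS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, N_GESTURES)

    def forward(self, seq):                       # seq: (batch, time, N_JOINTS)
        _, h = self.rnn(seq)
        return self.head(h[-1])


def simplify(seq, angle_thresh=5.0):
    """Keep only frames whose joint angles moved more than angle_thresh degrees
    since the last kept frame (an assumed stand-in for the paper's
    sequence simplification algorithm)."""
    kept = [seq[0]]
    for frame in seq[1:]:
        if torch.max(torch.abs(frame - kept[-1])) > angle_thresh:
            kept.append(frame)
    return torch.stack(kept)


def run_stream(stream, spotter, recognizer, start_th=0.2, end_th=0.95):
    """Spot a gesture in a continuous angle stream, simplify it, and classify it."""
    buffer, in_gesture, prev_gps = [], False, 0.0
    for t in range(WINDOW, stream.shape[0]):
        window = stream[t - WINDOW:t].unsqueeze(0)       # (1, WINDOW, N_JOINTS)
        gps = spotter(window).item()
        if not in_gesture and prev_gps <= start_th < gps:   # GPS rises past start threshold
            in_gesture, buffer = True, list(stream[t - WINDOW:t])
        elif in_gesture:
            buffer.append(stream[t])
            if gps >= end_th:                            # gesture judged complete
                seq = simplify(torch.stack(buffer)).unsqueeze(0)
                label = recognizer(seq).argmax(-1).item()
                print(f"t={t}: recognized gesture {label}")
                in_gesture = False
        prev_gps = gps


if __name__ == "__main__":
    torch.manual_seed(0)
    spotter, recognizer = SpottingNet().eval(), RecognitionNet().eval()
    with torch.no_grad():
        # Random angles stand in for glove data; trained weights would be needed
        # for meaningful spotting and recognition.
        run_stream(torch.randn(200, N_JOINTS) * 30.0, spotter, recognizer)
```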