Related Researcher

Bae, Joonbum
Bio-robotics and Control Lab.
Detailed Information


Real-time Gesture Recognition in the view of Repeating Characteristics of Sign Languages

Author(s)
Lee, Minhyuk; Bae, Joonbum
Issued Date
2022-12
DOI
10.1109/TII.2022.3152214
URI
https://scholarworks.unist.ac.kr/handle/201301/57300
Citation
IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, v.18, no.12, pp.8818 - 8828
Abstract
Recognizing dynamic gestures from a continuous data stream has been treated as a difficult task. One challenge is the gesture spotting problem: segmenting a gesture pattern from a data stream containing consecutive gestures. Although various gesture spotting strategies have been introduced so far, each has its own limitations when applied to real-world situations such as sign language conversation. One condition that must be satisfied for practical applications is the ability to detect repetitive gestures, which are used to emphasize a certain meaning. From the viewpoint of gesture spotting, repetitive gestures are hard to detect because similar starting and ending moments repeat without constraints on the number of repetitions. We introduce a data glove-based real-time dynamic gesture recognition system that covers both repetitive and non-repetitive gestures. The proposed system consists of three steps: gesture spotting, gesture sequence compression, and gesture recognition. For gesture spotting, we define a quantity termed the gesture progress sequence (GPS), which quantifies the progress of a gesture with numbers from 0 to 1. Gesture sequence compression removes variations, including the number of repetitions, that seek to impart the same message. In the gesture recognition step, the compressed gesture patterns are classified. The proposed system was evaluated using 17 American Sign Language (ASL) gestures and several ASL sentences.
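The three-step pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold values, function names, and re-arming logic are all assumptions; only the ideas (spotting a gesture when its progress measure completes, then collapsing repetitions so an emphasized gesture compresses to the same pattern as a single occurrence) come from the abstract.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# spot_gestures: the GPS quantifies gesture progress from 0 to 1;
# here a gesture is counted each time progress crosses a completion
# threshold and later resets (thresholds are assumed, not from the paper).
def spot_gestures(gps_stream, done=0.95, reset=0.5):
    completions = []
    armed = True
    for i, progress in enumerate(gps_stream):
        if armed and progress >= done:
            completions.append(i)  # index where a gesture completes
            armed = False
        elif progress < reset:     # progress has reset; re-arm
            armed = True
    return completions

# compress_sequence: collapse consecutive repetitions so a gesture
# repeated for emphasis yields the same pattern as a single occurrence.
def compress_sequence(labels):
    compressed = []
    for label in labels:
        if not compressed or compressed[-1] != label:
            compressed.append(label)
    return compressed
```

For example, a stream in which the same sign is performed twice in a row produces two spotted completions, and `compress_sequence(["HELLO", "HELLO", "THANKS"])` reduces to `["HELLO", "THANKS"]`, ready for the classification step.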
Publisher
Institute of Electrical and Electronics Engineers
ISSN
1551-3203
Keyword (Author)
Heuristic algorithms; Real-time systems; Feature extraction; Shape; Hidden Markov models; Artificial neural networks; gesture recognition; pattern recognition; sign language; supervised learning; wearable sensors; Gesture recognition; Assistive technologies
Keyword
HAND POSTURE

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.