File Download

There are no files associated with this item.


Full metadata record

DC Field Value Language
dc.citation.endPage 8828 -
dc.citation.number 12 -
dc.citation.startPage 8818 -
dc.citation.title IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS -
dc.citation.volume 18 -
dc.contributor.author Lee, Minhyuk -
dc.contributor.author Bae, Joonbum -
dc.date.accessioned 2023-12-21T13:17:14Z -
dc.date.available 2023-12-21T13:17:14Z -
dc.date.created 2022-02-20 -
dc.date.issued 2022-12 -
dc.description.abstract Recognizing dynamic gestures from a continuous data stream is a difficult task. One challenge is the gesture spotting problem, i.e., segmenting a gesture pattern from a data stream containing consecutive gestures. Although various gesture spotting strategies have been introduced, each method has limitations when applied to real-world situations such as sign language conversation. One condition that must be satisfied for practical applications is the detection of repetitive gestures used to emphasize a certain meaning. From the perspective of gesture spotting, repetitive gestures are hard to detect because similar starting and ending moments recur without any constraint on the number of repetitions. We introduce a data glove-based real-time dynamic gesture recognition system that covers both repetitive and non-repetitive gestures. The proposed system consists of three steps: gesture spotting, gesture sequence compression, and gesture recognition. For gesture spotting, we define a quantity termed the gesture progress sequence (GPS), which quantifies the progress of a gesture using numbers from 0 to 1. Gesture sequence compression removes variations, including the number of repetitions, among gestures that convey the same message. In the gesture recognition step, the compressed gesture patterns are classified. The proposed system was evaluated on 17 American Sign Language (ASL) gestures and several ASL sentences. -
dc.identifier.bibliographicCitation IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, v.18, no.12, pp.8818 - 8828 -
dc.identifier.doi 10.1109/TII.2022.3152214 -
dc.identifier.issn 1551-3203 -
dc.identifier.scopusid 2-s2.0-85125294952 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/57300 -
dc.identifier.wosid 000862429800046 -
dc.language English -
dc.publisher Institute of Electrical and Electronics Engineers -
dc.title Real-time Gesture Recognition in the view of Repeating Characteristics of Sign Languages -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Automation & Control Systems;Computer Science, Interdisciplinary Applications;Engineering, Industrial -
dc.relation.journalResearchArea Automation & Control Systems;Computer Science;Engineering -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Heuristic algorithms -
dc.subject.keywordAuthor Real-time systems -
dc.subject.keywordAuthor Feature extraction -
dc.subject.keywordAuthor Shape -
dc.subject.keywordAuthor Hidden Markov models -
dc.subject.keywordAuthor Artificial neural networks -
dc.subject.keywordAuthor gesture recognition -
dc.subject.keywordAuthor pattern recognition -
dc.subject.keywordAuthor sign language -
dc.subject.keywordAuthor supervised learning -
dc.subject.keywordAuthor wearable sensors -
dc.subject.keywordAuthor Gesture recognition -
dc.subject.keywordAuthor Assistive technologies -
dc.subject.keywordPlus HAND POSTURE -
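
The abstract above describes a three-step pipeline: spotting gestures from a continuous stream via a gesture progress sequence (GPS) ranging from 0 to 1, compressing repeated gestures into a canonical sequence, and classifying the result. The sketch below is only an illustration of that general idea, not the paper's method: the threshold GPS_DONE, the reset logic, and the function names spot_gestures and compress_sequence are assumptions made for demonstration.

```python
"""Illustrative sketch only: GPS_DONE, spot_gestures, and compress_sequence
are assumed names/values for demonstration and are not taken from the paper."""

from itertools import groupby
from typing import List, Tuple

GPS_DONE = 0.9  # assumed threshold: a gesture counts as completed once its
                # gesture progress sequence (GPS) value exceeds this


def spot_gestures(gps_stream: List[Tuple[str, float]]) -> List[str]:
    """Turn a per-frame stream of (candidate_label, gps_value) pairs into a
    list of completed gesture labels. A label is emitted each time its GPS
    rises above GPS_DONE and is re-armed once the progress falls back."""
    spotted = []
    armed = True                      # ready to emit the next completion
    for label, gps in gps_stream:
        if armed and gps >= GPS_DONE:
            spotted.append(label)
            armed = False             # wait for the progress to reset
        elif gps < 0.5 * GPS_DONE:
            armed = True
    return spotted


def compress_sequence(labels: List[str]) -> List[str]:
    """Collapse consecutive repetitions of the same gesture label, so that a
    gesture repeated for emphasis maps to the same pattern as a single one."""
    return [label for label, _ in groupby(labels)]


if __name__ == "__main__":
    # Synthetic stream: "YES" is performed three times in a row, then "THANK-YOU" once.
    stream = [("YES", 0.2), ("YES", 0.95), ("YES", 0.1), ("YES", 0.92),
              ("YES", 0.05), ("YES", 0.97), ("THANK-YOU", 0.3), ("THANK-YOU", 0.96)]
    spotted = spot_gestures(stream)       # ['YES', 'YES', 'YES', 'THANK-YOU']
    print(compress_sequence(spotted))     # ['YES', 'THANK-YOU']
```

In this toy example the repeated "YES" gestures collapse to a single label before classification, which mirrors the role the paper assigns to gesture sequence compression: removing the variation introduced by an unconstrained number of repetitions.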


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.