Related Researcher
Lim, Sunghoon (임성훈), Industrial Intelligence Lab.
Detailed Information
Applying Multistep Classification Techniques With Pre-Classification to Recognize Static and Dynamic Hand Gestures Using a Soft Sensor-Embedded Glove

Author(s)
Jeon, Sujin; Park, Soyeon; Bae, Joonbum; Lim, Sunghoon
Issued Date
2024-10
DOI
10.1109/JSEN.2024.3445128
URI
https://scholarworks.unist.ac.kr/handle/201301/84277
Citation
IEEE SENSORS JOURNAL, v.24, no.19, pp.30668 - 30679
Abstract
Hand gestures have been widely used as an efficient information source for human-computer interaction (HCI) and are applied in numerous fields, such as sign language translation and virtual reality (VR). Existing research on hand gesture recognition mainly considers either static gestures or dynamic gestures as gesture classes. However, simultaneously distinguishing both static and dynamic gestures is crucial for using hand gesture recognition systems in the real world, because humans perform both types of gestures rather than only one type. In this research, a multistep classification technique with a novel step called pre-classification is applied to simultaneously segment gesture patterns and classify static and dynamic gestures using a soft sensor-embedded glove, which measures the metacarpophalangeal (MCP) and proximal interphalangeal (PIP) joint angles of each finger every 15 ms. Specifically, the proposed hand gesture recognition system uses pre-classification to label gesture patterns as static or dynamic, so that both gesture types can be handled by separately designed classifiers within one system for higher recognition performance. Combined with the gesture segmentation step, pre-classification shows reliable performance when a user performs various gesture types continuously. The proposed system's effectiveness is validated through experiments with ten samples from eight subjects continuously performing eight static gestures and 11 dynamic gestures based on American Sign Language (ASL). The proposed system achieved 99.19% and 99.20% classification accuracy on static and dynamic gestures, respectively, and lower gesture segmentation error than existing methods. This work can be applied to various application areas where high recognition performance is required, such as VR workforce training in manufacturing and medicine.
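The pre-classification step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the motion-energy heuristic, the threshold value, and the stub classifiers are all assumptions introduced here; only the input shape (ten MCP/PIP joint angles sampled every 15 ms) comes from the abstract.

```python
import numpy as np

# A segmented gesture window from the glove is modeled as a (T, 10) array:
# T frames, one MCP and one PIP joint angle per finger, sampled every 15 ms.

def pre_classify(window, motion_thresh=1.0):
    """Label a segmented window as 'static' or 'dynamic'.

    Uses mean absolute frame-to-frame joint-angle change as a simple
    motion-energy proxy; the threshold is a placeholder, not a value
    from the paper.
    """
    motion = float(np.mean(np.abs(np.diff(window, axis=0))))
    return "static" if motion < motion_thresh else "dynamic"

def recognize(window, static_clf, dynamic_clf):
    """Dispatch the window to one of two separately designed classifiers
    based on the pre-classification result."""
    kind = pre_classify(window)
    label = (static_clf if kind == "static" else dynamic_clf)(window)
    return kind, label

# Dummy stand-ins for the two separately trained gesture classifiers.
static_clf = lambda w: "static ASL gesture"
dynamic_clf = lambda w: "dynamic ASL gesture"

held_pose = np.full((20, 10), 30.0)  # constant joint angles -> static
moving = np.tile(np.linspace(0.0, 90.0, 20)[:, None], (1, 10))  # sweep -> dynamic
print(recognize(held_pose, static_clf, dynamic_clf))  # ('static', ...)
print(recognize(moving, static_clf, dynamic_clf))     # ('dynamic', ...)
```

The point of splitting the pipeline this way is that a classifier tuned for held poses and one tuned for motion trajectories can each be simpler and more accurate than a single model covering both gesture types.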
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
ISSN
1530-437X
Keyword (Author)
soft sensors; data gloves; deep learning; wearable sensors; gesture recognition
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.