Full metadata record

dc.contributor.advisor: Baek, Seungryul
dc.contributor.author: Hor, Siekny
dc.date.accessioned: 2024-10-14T13:50:43Z
dc.date.available: 2024-10-14T13:50:43Z
dc.date.issued: 2024-08
dc.description.abstract: Hand pose estimation has gained significant interest recently, leading to the development of various methods. Existing methods attempt to boost performance but often face efficiency challenges. In this work, we propose a lightweight graph-based network optimized for both accuracy and efficiency in 3D single-hand pose estimation. Our design leverages Chebyshev Graph Convolutions (ChebGConv) to streamline the 2D encoding process, reducing computational overhead. Additionally, we introduce a coarse-to-fine ChebGConv module in the 3D decoder to progressively refine the hand mesh reconstruction, enhancing accuracy. We also improve our model through ensemble distillation, transferring knowledge from high-performing teacher models. Notably efficient, our model has only 8.48M parameters and requires 1.7G FLOPs, achieving 55 FPS on a CPU and 109 FPS on a GPU. Despite its lightweight nature, our model demonstrates competitive accuracy, achieving a PA-MPJPE of 5.7mm and a PA-MPVPE of 5.9mm on the FreiHAND dataset, and a PA-MPJPE of 8.7mm and a PA-MPVPE of 8.9mm on the HO3D dataset.
dc.description.degree: Master
dc.description: Graduate School of Artificial Intelligence
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/84197
dc.identifier.uri: http://unist.dcollection.net/common/orgView/200000813222
dc.language: ENG
dc.publisher: Ulsan National Institute of Science and Technology
dc.title: Rethinking Fast and Accurate 3D Hand Pose Estimation
dc.type: Thesis
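
The abstract above names Chebyshev Graph Convolutions (ChebGConv) as the core building block of the network. For readers unfamiliar with the operation, here is a minimal, self-contained PyTorch sketch of a standard Chebyshev graph convolution layer (in the style of Defferrard et al., 2016); the class name, tensor shapes, and polynomial order are illustrative assumptions, not the thesis implementation.

```python
import torch
import torch.nn as nn


class ChebGConv(nn.Module):
    """Minimal Chebyshev graph convolution sketch (assumed, not the thesis code).

    Computes y = sum_{k=0}^{K-1} T_k(L~) x W_k, where T_k are Chebyshev
    polynomials of the rescaled graph Laplacian L~ = 2L/lambda_max - I.
    """

    def __init__(self, in_features: int, out_features: int, K: int = 3):
        super().__init__()
        self.K = K  # Chebyshev polynomial order (K hops of neighborhood)
        # One weight matrix per polynomial term T_k
        self.weight = nn.Parameter(torch.empty(K, in_features, out_features))
        nn.init.xavier_uniform_(self.weight)

    def forward(self, x: torch.Tensor, lap: torch.Tensor) -> torch.Tensor:
        # x:   (num_nodes, in_features) node features, e.g. hand mesh vertices
        # lap: (num_nodes, num_nodes) rescaled graph Laplacian L~
        tx_prev = x                         # T_0(L~) x = x
        out = tx_prev @ self.weight[0]
        if self.K > 1:
            tx_curr = lap @ x               # T_1(L~) x = L~ x
            out = out + tx_curr @ self.weight[1]
            for k in range(2, self.K):
                # Chebyshev recurrence: T_k = 2 L~ T_{k-1} - T_{k-2}
                tx_next = 2 * (lap @ tx_curr) - tx_prev
                out = out + tx_next @ self.weight[k]
                tx_prev, tx_curr = tx_curr, tx_next
        return out
```

Because the K polynomial terms are built by a simple sparse-friendly recurrence rather than an eigendecomposition, a layer like this keeps the per-vertex cost low, which is consistent with the abstract's emphasis on a lightweight encoder and decoder.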


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.