DeepHPS: End-to-end Estimation of 3D Hand Pose and Shape by Learning from Synthetic Depth

Abstract: Articulated hand pose and shape estimation is an important problem for vision-based applications such as augmented reality and animation. In contrast to existing methods, which optimize only for joint positions, we propose a fully supervised deep network that learns to jointly estimate a full 3D hand mesh representation and pose from a single depth image. To this end, a CNN architecture is employed to estimate parametric representations, i.e., hand pose, bone scales, and complex shape parameters. Then, a novel hand pose and shape layer, embedded inside our deep framework, produces 3D joint positions and a hand mesh. A lack of sufficient training data with varying hand shapes limits the generalization performance of learning-based methods; moreover, manually annotating real data is suboptimal. Therefore, we present SynHand5M: a million-scale synthetic dataset of depth maps with accurate joint annotations, segmentation masks, and mesh files. Among model-based learning (hybrid) methods, we show improved results on our dataset and on two public benchmarks, NYU and ICVL. Also, by employing a joint training strategy with real and synthetic data, we recover 3D hand mesh and pose from real images in 3.7 ms.
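The abstract describes a two-stage pipeline: a CNN regresses a parametric hand representation (pose, bone scales, shape coefficients) from a depth image, and a differentiable hand pose-and-shape layer maps those parameters to 3D joint positions and a mesh. The following is a minimal interface sketch of that data flow; all function names, dimensions, and return values here are illustrative assumptions, not the authors' actual architecture.

```python
# Sketch of the two-stage estimation pipeline described in the abstract.
# Dimensions (26 pose angles, 20 bone scales, 10 shape coefficients,
# 100 mesh vertices) are placeholder assumptions for illustration.

def cnn_encoder(depth_image):
    """Stand-in for the CNN that regresses parametric representations
    (hand pose, bone scales, shape parameters) from one depth image."""
    # A real network would be learned; fixed-size placeholder vectors
    # here only show the interface between the two stages.
    pose = [0.0] * 26          # joint angles (assumed dimensionality)
    bone_scales = [1.0] * 20   # one scale per bone (assumed)
    shape = [0.0] * 10         # low-dimensional shape coefficients (assumed)
    return pose, bone_scales, shape

def hand_pose_shape_layer(pose, bone_scales, shape):
    """Stand-in for the embedded hand pose and shape layer that maps
    the parametric representation to 3D joints and a hand mesh."""
    joints = [(0.0, 0.0, 0.0) for _ in pose]      # one 3D point per joint
    mesh_vertices = [(0.0, 0.0, 0.0)] * 100       # placeholder mesh
    return joints, mesh_vertices

def estimate_hand(depth_image):
    # End-to-end: depth image -> parameters -> joints + mesh,
    # so supervision can be applied on joints and mesh jointly.
    pose, bone_scales, shape = cnn_encoder(depth_image)
    return hand_pose_shape_layer(pose, bone_scales, shape)

joints, mesh = estimate_hand(depth_image=None)
print(len(joints), len(mesh))
```

Because the hand layer sits inside the network, gradients from joint- and mesh-level losses can flow back into the CNN's parameter estimates, which is what makes the setup end-to-end trainable.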
Document type: Conference papers

https://hal-uphf.archives-ouvertes.fr/hal-03383020
Contributor: Kathleen Torck
Submitted on: Monday, October 18, 2021 - 2:01:36 PM
Last modification on: Wednesday, November 3, 2021 - 8:45:55 AM

Citation

Muhammad Jameel Nawaz Malik, Ahmed Elhayek, Fabrizio Nunnari, Kiran Varanasi, Kiarash Tamaddon, et al. DeepHPS: End-to-end Estimation of 3D Hand Pose and Shape by Learning from Synthetic Depth. 2018 International Conference on 3D Vision (3DV), Sep 2018, Verona, Italy. pp. 110-119, ⟨10.1109/3DV.2018.00023⟩. ⟨hal-03383020⟩
