VRHands3D

MSc Thesis Project

Recent developments in HCI and VR/AR present novel interaction techniques alongside increasingly sophisticated hardware and software. Despite these advances, mobile VR/AR remains one of the most accessible and cost-effective ways for everyday users to experience VR/AR. However, mobile VR/AR lacks an intuitive input method. With hand tracking, or hand pose estimation, bare hands can be incorporated using existing hardware (i.e., the back-facing camera). Yet estimating 3D hand pose in real time from monocular RGB-only input on mobile devices is not trivial: the inherent depth ambiguity of a projected hand makes it very challenging to recover 3D joint positions without any depth information, and the limited computational resources of mobile devices make real-time performance hard to achieve. To address this, we propose VRHands3D, an end-to-end articulated 3D hand pose estimation method that runs in real time on mobile devices.
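Monocular methods in this setting commonly resolve the depth ambiguity with a 2.5D representation: the network predicts per-joint 2D pixel locations plus a depth estimate, which are then lifted to 3D camera coordinates via the pinhole camera model. The sketch below illustrates only that standard back-projection step, not the VRHands3D network itself; the joint detections, depths, and camera intrinsics are hypothetical placeholder values.

```python
import numpy as np

def backproject_joints(uv, z, fx, fy, cx, cy):
    """Lift 2D joint detections (u, v) in pixels plus per-joint depth z
    (in metres) to 3D camera-space coordinates using pinhole intrinsics."""
    X = (uv[:, 0] - cx) * z / fx
    Y = (uv[:, 1] - cy) * z / fy
    return np.stack([X, Y, z], axis=1)  # shape (num_joints, 3)

# Hypothetical 2D detections for two hand joints and their predicted depths
uv = np.array([[320.0, 240.0], [350.0, 260.0]])
z = np.array([0.45, 0.47])  # metres from the camera
joints3d = backproject_joints(uv, z, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
```

A joint detected exactly at the principal point (cx, cy) back-projects onto the optical axis (X = Y = 0), which makes the geometry easy to sanity-check.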