Building on the Introduction to Virtual Reality, OpenXR Hand-Tracking, and Gesture Recognition in Unreal Engine tutorial, this slightly more advanced tutorial will dive deeper into the following topics:
- Transitioning seamlessly between motion controller and hand-tracking modes in Unreal Engine.
- Adding custom debugging gizmos to improve development and testing workflows.
- Visualizing debug virtual hands by incorporating the custom gizmos.
- Animating virtual hand meshes with OpenXR hand-tracking data, moving beyond the basic cube-based joint visualization.
- Reusing and adapting the gesture-recognition code from the introductory tutorial so it works with the new animated virtual hand meshes.
This guide will help you take your VR projects to the next level with polished and practical implementations.
English Video Tutorials
- The English video tutorial, part 1
- The English video tutorial, part 2
Farsi Video Tutorials
- The Farsi (Persian) video tutorial, part 1
- The Farsi (Persian) video tutorial, part 2
Project Source Files
- The project source files on Microsoft Azure DevOps Repositories.
- The project source files on GitLab.
See Also
- Optimizing Unreal Engine VR Projects for Higher Framerates (Meta Quest, HTC VIVE, FFR, ETFR, NVIDIA DLSS, AMD FSR, and Intel XeSS Tips Included!)
- Building Unreal Engine 5.6 From the GitHub Source Code on GNU/Linux With Android Support
- Rust Devs Think We’re Hopeless; Let’s Prove Them Wrong (with C++ Memory Leaks)!
- Building Unreal Engine 5.6 From the GitHub Source Code on Microsoft Windows