Introducing High Frequency Hand Tracking

Oculus Developer Blog | Posted by Oculus VR | April 28, 2021

It’s been a little over a year since we introduced Hand Tracking support. Since then, we've seen incredible innovation and integration in apps such as The Curious Tale of the Stolen Pets, Waltz of the Wizard, Vacation Simulator, Gloomy Eyes, The Line, and more.

Hand tracking enables the use of hands as an input method on Oculus Quest, creating a more immersive, intuitive experience that is particularly well suited to social interactions. While the potential is great, there is still room for improvement, and robust gestures can be difficult to implement with the current tracking mode.

Today we’re introducing High Frequency Hand Tracking, a new tracking mode that allows for better gesture detection and lower latency. We are also updating our policy on GPU and CPU consumption for apps that use hand tracking. Note that this feature is only available on Quest 2.

With High Frequency Hand Tracking, you can upgrade to more robust and reliable gesture detection, a more stable hand depiction, and latency improvements of about 10%. Together these changes create a more immersive experience and move closer to a seamless replication of natural movement within VR. One current tradeoff of High Frequency tracking is a slight increase in jitter (particularly under low-light conditions). We are working on addressing this and expect to reduce jitter back to previous levels in subsequent releases.

Enabling High Frequency Hand Tracking is a manual process: it requires a one-line change to the Android manifest, adding the following meta-data tag inside the manifest's <application> element.
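A minimal sketch of that tag, assuming the documented com.oculus.handtracking.frequency setting:

    <!-- Inside the <application> element of AndroidManifest.xml -->
    <!-- Requests high frequency hand tracking on Quest 2 -->
    <meta-data
        android:name="com.oculus.handtracking.frequency"
        android:value="HIGH" />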

This manifest configuration will be auto-generated by our UE4 and Unity integrations.

To provision the proper compute budget for High Frequency Hand Tracking, and to prevent the rare risk of overheating devices, the CPU and GPU must be downclocked. We found that overheating was technically also possible with the current low frequency tracking, so to spare you from managing thermals at the developer level, we will also be downclocking apps that use low frequency hand tracking (albeit less aggressively). The downclocking levels are:

Low frequency: CPU level 3, GPU level 3
High frequency: CPU level 3, GPU level 2

We will not be downclocking apps already on the Store, but any update to such an app will be subject to downclocking. Learn more about CPU/GPU levels in our documentation.

Note that this downclocking is already in effect for high frequency tracking if you activate it through the manifest file. It is not yet in effect for low frequency tracking, but you can simulate the impact by setting the CPU/GPU levels yourself, as sketched below.
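To preview that impact, here is a minimal UE4 C++ sketch, assuming the Oculus integration's UOculusFunctionLibrary::SetCPUAndGPULevels is available in your engine branch (AMyGameMode is a hypothetical AGameModeBase subclass in your project):

    #include "MyGameMode.h"
    #include "OculusFunctionLibrary.h"

    void AMyGameMode::BeginPlay()
    {
        Super::BeginPlay();

        // Simulate the low frequency hand tracking clocks
        // (CPU level 3, GPU level 3). For the high frequency
        // budget, pass (3, 2) instead.
        UOculusFunctionLibrary::SetCPUAndGPULevels(3, 3);
    }

Running with these levels lets you profile your app against the new budget before you opt in through the manifest.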

To illustrate some of the possibilities with these new updates, Tiny Castles and First Steps integrated High Frequency Hand Tracking into their games, along with recent updates from the Hands API such as far- and near-field gesture mechanics and custom hand meshes. For both games, this update provided consistently higher confidence when tracking fast hand movements, alongside expected reductions to latency. Read more about their process in the following blogs:

Adding Hand Tracking to First Steps

Hand Tracking in Tiny Castles

For UE4 developers, we're introducing Hand Pose Recognition Showcase, which demonstrates a relatively simple system that can recognize hand poses (e.g. thumbs up, peace sign) and gestures (e.g. grabbing, flicking through slides) using the raw hand bone information from our High Frequency Hand Tracking system.

This sample can be found in our private UE4 branch on GitHub (https://github.com/Oculus-VR/UnrealEngine) under Samples/Oculus/HandPoseShowcase.

To get access to this private repository, follow the instructions at https://developer.oculus.com/documentation/unreal/unreal-building-ue4-from-source/. The showcase comes complete with art assets, open-source C++ code, and documentation in Samples/Oculus/HandPoseShowcase/HandPoseShowcase.md. Its hand recognition plugin can easily be copied to your own projects, and you'll be exploring your own gesture ideas in no time.
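To give a flavor of the approach, here is a simplified, engine-agnostic C++ sketch of pose recognition over per-finger curl values derived from raw bone rotations. This is illustrative only, not the showcase's plugin code; the HandState snapshot and its values are hypothetical stand-ins for data you would pull from the Hands API:

    #include <array>
    #include <cstdio>

    enum Finger { Thumb, Index, Middle, Ring, Pinky, FingerCount };

    // Hypothetical per-frame snapshot of the tracked hand.
    struct HandState {
        std::array<float, FingerCount> curl; // 0 = straight, 1 = fully curled,
                                             // derived from bone rotations
        float confidence;                    // tracking confidence, 0..1
    };

    // "Thumbs up": thumb extended, all other fingers curled, and the
    // tracker confident enough that the pose is not jitter.
    bool IsThumbsUp(const HandState& hand)
    {
        if (hand.confidence < 0.8f) return false;
        if (hand.curl[Thumb] > 0.3f) return false;
        for (int f = Index; f < FingerCount; ++f)
            if (hand.curl[f] < 0.7f) return false;
        return true;
    }

    int main()
    {
        HandState hand{{0.1f, 0.9f, 0.95f, 0.9f, 0.85f}, 0.95f};
        std::printf("thumbs up: %s\n", IsThumbsUp(hand) ? "yes" : "no");
    }

A real recognizer, like the one in the showcase, would debounce poses over several frames and compute curl from the actual bone transforms; the higher-confidence data from High Frequency Hand Tracking is what makes this kind of threshold-based detection reliable during fast hand movement.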

We’re looking forward to hearing your feedback. Please let us know your thoughts in the comments or on the Developer Forum.