Hand Tracking in Unreal Engine

Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.

The hand tracking feature enables the use of hands as an input method on Oculus Quest devices. It provides a new sense of presence, enhances social engagement, and can help deliver more natural interactions.

The hand tracking feature also allows you to develop UI elements that can be operated with hands and controllers interchangeably. When you opt to use hands, near-field interactions let users pinch or poke objects with their hands. For far-field interactions, the hand's pose drives a laser cursor-pointer that behaves like the standard controller cursor. Use the cursor-pointer to highlight, select, and click, or write your own app-level event logic.

Note that hand tracking complements Touch controllers, but is not intended to replace controllers in all scenarios, particularly with games or creative tools that require a high degree of precision.

If you have not previously implemented input motion controllers in Unreal Engine, see UE’s Motion Controller Component Setup.

Before you start implementing hand tracking in your app, see the Hand Tracking Design Guidelines for terminology, best practices and interaction models when using hands as an input source in virtual reality.

You should also be familiar with the store guidelines for how hand tracking must be implemented in your app:

  • VRC.Quest.Input.5: Hands must render in the correct position and orientation, and must animate properly
  • VRC.Quest.Input.6: Hands must be hidden if they are not being tracked or if tracking confidence is low
  • VRC.Quest.Input.7: The application must properly respect when input is switched between controllers and hands
  • VRC.Quest.Input.8: The system gesture is reserved, and should not trigger any other actions within the application

For an overview about how hand tracking is implemented and how it has been used in apps, see the OC6 Video Presentation: Hand Tracking Deep Dive: Technology, Design, and Experiences.

Hand Tracking Architecture

The following image shows the architecture of the Oculus hand tracking implementation for UE4, and how input information from hands is routed through the same mechanism as controller input.

As shown in the diagram, Oculus Input remains the main source of input data for UE4. Oculus Input routes hand input through the UE4 input system the same way that controller buttons and sticks are. Pinches and pinch strength are also routed as hand input.

Hand-specific features like the mesh/skeleton and bone rotation are provided through the OculusHandTracking class, which is contained within the Oculus Input module. The OculusHandTracking class provides the blueprint library as well as access to hand-specific data like hand scale, pointer pose, bone rotation, and more.

Sample Implementations

There are two samples that show how to implement hand tracking using Unreal Engine:

Turn On Hand Tracking in Your Project

You can turn on hand tracking in Unreal Engine in the Project Settings, which adds the com.oculus.permission.HAND_TRACKING entry to the Android manifest for your project.

Go to Edit > Project Settings, then under Plugins select OculusVR. Under Hand Tracking Support, choose:

  • Controllers: Hand tracking will not be enabled for your app
  • Controllers and Hands: A user can use hand tracking or controllers in your app
  • Hands Only: A user must have hand tracking enabled on their device to use your app
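This choice corresponds to a setting in the OculusVR plugin's runtime settings, which the editor writes to your project's DefaultEngine.ini. A minimal sketch of what the saved setting looks like, assuming the section and key names used by the UE4 OculusVR plugin (verify against your plugin version rather than typing this by hand):

```ini
; Config/DefaultEngine.ini -- sketch; names may vary by engine/plugin version
[/Script/OculusHMD.OculusHMDRuntimeSettings]
HandTrackingSupport=ControllersAndHands
```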

High Frequency Hand Tracking

High frequency hand tracking offers more robust and reliable gesture detection, stable hand depiction, and latency improvements of about 10%. This creates a closer replication of natural hand movement and a more immersive experience for users who use hands as input.

Before you choose high frequency hand tracking, it is important to understand that the system reserves some performance headroom from the app's compute budget. To provision the proper amount of compute for high frequency tracking and to prevent the rare risk of overheating devices, the CPU and GPU are downclocked. So that you do not have to manage thermals yourself, apps running with low frequency hand tracking may also be downclocked if needed. The downclocking levels are CPU level 3 and GPU level 3 for low frequency, and CPU level 3 and GPU level 2 for high frequency. Apps already available on the Oculus Store are not downclocked, but any update to such an app will lead to downclocking.

One current tradeoff of high frequency tracking is a slight increase in jitter, particularly under low-light conditions. We are aware of this issue and are working to reduce the jitter to its previous level. We will announce the fix when it is available.

To set high frequency, go to Project Settings > OculusVR > Hand Tracking Frequency and select High.
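As with hand tracking support, this choice is persisted in your project's DefaultEngine.ini. A sketch, assuming the key name used by the UE4 OculusVR plugin (the exact value casing may differ by version):

```ini
; Config/DefaultEngine.ini -- sketch; verify the key and value in your plugin version
[/Script/OculusHMD.OculusHMDRuntimeSettings]
HandTrackingFrequency=HIGH
```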

Integration Details

The Oculus hand tracking integration for Unreal Engine features the following.

Updates to the Oculus Input Module

In summary, the Oculus Input module has these additions for hand tracking:

  • The Oculus Input Module supports input from touch controllers and hand tracking.
  • The Oculus Input Module relays hand pinches and pinch strength through UE4’s input event system.
  • The module updates and stores new hand poses, which can be accessed through blueprints or through the OculusHandComponent.


  • The FOculusHandState struct has been added. Similar to the controller-state structs, this struct provides the current hand state inputs and tracked state.

  • Pinch inputs are updated with key events and axes for pinches and pinch strength.
    • New key names and axes for hands are registered in the UE4 input system. These identify the fingers of each hand as Thumb, Index, Middle, Ring, and Pinky.

    • Pinches and pinch strength can be bound in the UE input settings so that their events can be associated with blueprints or with the OculusHandComponent. See Input Bindings in the next section for how to do this.

  • Hand poses are updated by the module each frame and stored for use by blueprints and the OculusHandComponent.

Input Bindings for Hand Tracking

You can bind to hand tracking inputs like pinches and pinch strength using Unreal Engine’s input system. To create a new input binding with hand tracking:

  • Go to Edit > Project Settings, then find Engine > Input.
  • Under Action Mappings or Axis Mappings, add a new mapping.
  • For the mapping key value, search for the Oculus Hand category to bring up the various hand tracking input bindings. The following image shows an example:

hand tracking input binding options
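The mappings you create in the editor are saved to your project's DefaultInput.ini. The following is a sketch of what the result might look like: the action and axis names are arbitrary, and the exact Oculus Hand key names shown here are assumptions, so select them from the Oculus Hand category in the editor rather than typing them by hand.

```ini
; Config/DefaultInput.ini -- sketch; key names are assumptions
[/Script/Engine.InputSettings]
; Fires as a digital press/release event when the left index finger pinches
+ActionMappings=(ActionName="LeftIndexPinch",Key=OculusHand_Left_IndexPinch)
; Reports pinch strength as a 0.0-1.0 axis value each frame
+AxisMappings=(AxisName="LeftIndexPinchStrength",Key=OculusHand_Left_IndexPinchStrength,Scale=1.0)
```

You can then bind these mappings in C++ with the usual InputComponent->BindAction and BindAxis calls, or handle them as input events in blueprints, exactly as you would for controller buttons and sticks.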

Hand Tracking Blueprints

The Oculus integration for Unreal Engine offers several resources, including the following blueprint functions:

  • GetBoneName: Returns the name of the bone from the bone ID.
  • GetBoneRotation: Returns all bone rotations corresponding to the hand type passed in.
  • GetDominantHand: Returns which hand is the user's dominant hand.
  • GetHandPointerPose: Returns the current hand pointer pose.
  • GetHandScale: Returns the hand scale.
  • GetSkeletalMeshFromType: Returns a USkeletalMesh for the specified hand. Assign the returned mesh to a USkeletalMeshComponent.
  • GetTrackingConfidence: Returns the tracking confidence of the hands.
  • InitializeHandPhysics: Initializes physics capsules on the runtime hand mesh.
  • IsHandTrackingEnabled: Returns true if hand tracking is enabled on the device.
  • IsPointerPoseValid: Returns true if the pointer pose is valid.
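The same library can also be called from C++. The following is a sketch only: it assumes a static UOculusHandTracking class in the OculusInput module with functions matching the blueprint nodes above, and it requires the OculusVR plugin, so it will not compile outside a UE4 project. Verify the header, enum, and signature names against your plugin version.

```cpp
// Sketch: query left-hand state via the OculusHandTracking blueprint library.
// Class, enum, and signature names are assumptions based on the list above.
#include "OculusHandTracking.h"

void LogLeftHandState()
{
    if (UOculusHandTracking::IsHandTrackingEnabled())
    {
        // Hand scale relative to the default hand model (1.0 = default size)
        const float Scale = UOculusHandTracking::GetHandScale(EOculusHandType::HandLeft);

        // Per VRC.Quest.Input.6, hide hands when tracking confidence is low
        const ETrackingConfidence Confidence =
            UOculusHandTracking::GetTrackingConfidence(EOculusHandType::HandLeft);

        UE_LOG(LogTemp, Log, TEXT("Left hand scale: %f"), Scale);
    }
}
```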

Oculus Hand Component

The Oculus Hand component (OculusHandComponent) is part of the OculusInput module. This component is a subclass of Unreal's UPoseableMeshComponent and must be a child of a UMotionController component, which provides the tracking pose and late-update functionality for hands.

The component handles loading the mesh and skeleton, as well as updating the bones. It also handles setting new materials for the hand and hiding the hands when tracking is lost or confidence is low. It provides options to update the root pose, update the root scale, set the pointer pose root, and enable physics capsules.
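For example, a pawn might attach one Oculus Hand component per hand under a motion controller. This is a sketch only: the pawn class is hypothetical, and property names such as SkeletonType and MeshType are assumptions based on the component's editor properties, so check them against your plugin version.

```cpp
// Sketch: OculusHandComponent must be a child of a motion controller component,
// which supplies the tracked pose and late update. AHandPawn is hypothetical.
#include "MotionControllerComponent.h"
#include "OculusHandComponent.h"

AHandPawn::AHandPawn()
{
    UMotionControllerComponent* LeftController =
        CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
    LeftController->SetupAttachment(RootComponent);
    LeftController->MotionSource = FName(TEXT("Left"));

    UOculusHandComponent* LeftHand =
        CreateDefaultSubobject<UOculusHandComponent>(TEXT("LeftHand"));
    LeftHand->SetupAttachment(LeftController);  // child of the motion controller
    LeftHand->SkeletonType = EOculusHandType::HandLeft;
    LeftHand->MeshType = EOculusHandType::HandLeft;
}
```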

The following image shows an example of these properties, and how to set them in UE4.

hand properties

Handling System Gestures

If the user performs the system gesture to return to Oculus Home or access the menu, the gesture is surfaced through the OVRPlugin as an ovrpButton_Start signal and a status flag, similar to the user pressing the Oculus Home or menu button on a controller. You do not need special menu logic for hands in this case.

The following image shows the pinch gesture as well as the system gesture.

Dominant Hand

Dominant hand features are surfaced through Hand Status flags of the OVRPlugin. You can access this information by using the blueprint function GetDominantHand.