Hand Tracking in Unreal Engine
Updated: Mar 6, 2025
Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.
The hand tracking feature enables the use of hands as an input method on Meta Quest devices. It provides a new sense of presence, enhances social engagement, and can help deliver more natural interactions.
The hand tracking feature also allows you to develop UI elements that can be operated with hands and controllers interchangeably. When you opt to use hands, users can pinch or poke objects for near-field interactions. For far-field interactions, the hand's pose drives a laser cursor-pointer that behaves like the standard controller cursor. Use the cursor-pointer to highlight, select, and click, or write your own app-level event logic.
Note that hand tracking complements Touch controllers, but is not intended to replace controllers in all scenarios, particularly with games or creative tools that require a high degree of precision.
Note: The recommended way for Unreal developers to integrate hand tracking is to use the Interaction SDK, which provides standardized interactions and gestures. Building custom interactions without the SDK can be a significant challenge and makes it difficult to get your app approved in the store.
To see examples of the hand tracking integration, check out the following:
Before you start implementing hand tracking in your app, see the Hand Tracking Design Guidelines for terminology, best practices, and interaction models when using hands as an input source in virtual reality.
You should also be familiar with the store guidelines for how hand tracking must be implemented in your app:
- VRC.Quest.Input.5: Hands must render in the correct position and orientation, and must animate properly
- VRC.Quest.Input.7: The application must properly respect when input is switched between controllers and hands
- VRC.Quest.Input.8: The system gesture is reserved, and should not trigger any other actions within the application
Hand Tracking Architecture
The architecture of the hand tracking implementation for Unreal Engine routes input information from hands using the same mechanism that routes controller input. Device input remains the main source of input data for Unreal Engine: hand input travels through the Unreal Engine input system the same way that controller buttons and sticks do. Pinches and pinch strength are also routed as hand input.
Hand-specific features like the mesh/skeleton and bone rotation are provided through the OculusHandTracking class, which is contained within the Input module. The OculusHandTracking class provides the Blueprint library as well as access to hand-specific data like hand scale, pointer pose, bone rotation, and more.
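As a minimal sketch, querying bone data from C++ might look like the following, assuming the class is exposed as UOculusHandTracking in the OculusInput module. The header path, the EBone value, and the exact signature (including the trailing controller index) are assumptions to verify against OculusHandTracking.h:

```cpp
#include "CoreMinimal.h"
#include "OculusHandTracking.h" // OculusInput module; header path is an assumption

// Logs the left thumb-tip rotation; call from, e.g., an actor's Tick.
void LogLeftThumbTipRotation()
{
	// Bone rotations are queried per hand and per bone ID. The enum value
	// names (EOculusHandType::HandLeft, EBone::Thumb_Tip) are assumptions.
	const FQuat ThumbTip = UOculusHandTracking::GetBoneRotation(
		EOculusHandType::HandLeft, EBone::Thumb_Tip, /*ControllerIndex=*/0);

	UE_LOG(LogTemp, Log, TEXT("Left thumb tip rotation: %s"),
		*ThumbTip.Rotator().ToString());
}
```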
There are two samples that show how to implement hand tracking using Unreal Engine:
Turn On Hand Tracking in Your Project
You can turn on hand tracking in Unreal Engine in the Project Settings, which adds the com.oculus.permission.HAND_TRACKING entry to the Android manifest for your project.
Go to Edit > Project Settings, then go to Plugins and select OculusVR.
Under Hand Tracking Support, choose:
- Controllers: Hand tracking will not be enabled for your app
- Controllers and Hands: A user can use hand tracking or controllers in your app
- Hands Only: A user must have hand tracking enabled on their device to use your app
To set high-frequency hand tracking, go to Project Settings > OculusVR > Hand Tracking Frequency and select High.
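For reference, these settings end up as entries in the generated Android manifest. A sketch of what to expect, based on Meta's documented manifest keys; verify against the manifest your build actually produces:

```xml
<!-- Added by enabling Hand Tracking Support -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- required is "false" for Controllers and Hands, "true" for Hands Only -->
<uses-feature android:name="oculus.software.handtracking" android:required="false" />

<application>
  <!-- Added when Hand Tracking Frequency is set to High -->
  <meta-data android:name="com.oculus.handtracking.frequency" android:value="HIGH" />
</application>
```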
In summary, the hand tracking integration for Unreal Engine adds the following to the input model:
- The Input Module supports input from touch controllers and hand tracking.
- The Input Module relays hand pinches and pinch strength through Unreal Engine’s input event system.
- The module updates and stores new hand poses, which can be accessed through Blueprints or through the OculusHandComponent.
Specifically:
- An FOculusHandState struct has been added. Similar to the controller-state structs, this struct provides the current hand state inputs and tracking state
- Pinch inputs are updated with key events and axes for pinches and pinch strength
- New key names and axes for hands are registered in the Unreal Engine input system. These identify the fingers of each hand as Thumb, Index, Middle, Ring, and Pinky
- Pinches and pinch strength can be bound in the UE input settings so that their events can be associated with Blueprints or with the OculusHandComponent. See Input Bindings in the next section for how to do this
- Hand poses are updated each frame; see The Hand Component below for how they are applied to a hand mesh
Input Bindings
You can bind to hand tracking inputs like pinches and pinch strength using Unreal Engine's input system. To create a new input binding with hand tracking:
- Go to Edit > Project Settings and find Engine > Input.
- Under Action Mappings or Axis Mappings, add a new mapping.
- For the mapping key value, search for the Oculus Hand category to bring up the various hand tracking input bindings.
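Once a mapping exists, it can be bound in code like any other action or axis. A minimal sketch, where HandSelect and HandSelectStrength are hypothetical mapping names created in the steps above (one mapped to an Oculus Hand pinch key, the other to the matching pinch-strength axis):

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/InputComponent.h"

// In your pawn's .cpp; the class declaration is omitted for brevity.
void AMyPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
	Super::SetupPlayerInputComponent(PlayerInputComponent);

	// "HandSelect" / "HandSelectStrength" are the hypothetical mapping
	// names defined in Project Settings > Engine > Input.
	PlayerInputComponent->BindAction(TEXT("HandSelect"), IE_Pressed, this, &AMyPawn::OnPinchBegin);
	PlayerInputComponent->BindAction(TEXT("HandSelect"), IE_Released, this, &AMyPawn::OnPinchEnd);
	PlayerInputComponent->BindAxis(TEXT("HandSelectStrength"), this, &AMyPawn::OnPinchStrength);
}

void AMyPawn::OnPinchBegin() { /* begin the pinch-driven interaction */ }
void AMyPawn::OnPinchEnd() { /* end the interaction */ }
void AMyPawn::OnPinchStrength(float Strength) { /* Strength ranges 0..1 */ }
```

Because the same mappings can also include controller keys, this logic works unchanged when the user switches between hands and controllers.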
The Oculus integration for Unreal Engine offers several resources, including a library of Blueprint functions in the OculusHandTracking class.
| Blueprint | Description |
| --- | --- |
| GetBoneName | Returns the name of the bone from the bone ID. |
| GetBoneRotation | Returns all bone rotations corresponding to the hand type passed in. |
| GetDominantHand | Returns which hand is the dominant user hand. |
| GetPointerPose | Returns the current hand pointer pose. |
| GetHandScale | Returns the hand scale. |
| GetHandSkeletalMesh | Returns a USkeletalMesh for the specified hand. Use the returned mesh to assign to a USkeletalMeshComponent. |
| GetTrackingConfidence | Returns the tracking confidence of hands. |
| InitializeHandPhysics | Initializes physics capsules on the runtime hand mesh. |
| IsHandTrackingEnabled | Returns true if hand tracking is enabled on the device. |
| IsPointerPoseValid | Returns true if the pointer pose is valid. |
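The same library can also be called from C++. A sketch of gating far-field pointer logic on hand tracking, assuming the functions above are static members of a UOculusHandTracking class; the header path and the controller-index parameters are assumptions:

```cpp
#include "CoreMinimal.h"
#include "OculusHandTracking.h" // header path is an assumption

void UpdateFarFieldPointer()
{
	// Skip hand logic entirely if the user is on controllers or the
	// device has hand tracking disabled.
	if (!UOculusHandTracking::IsHandTrackingEnabled())
	{
		return;
	}

	if (UOculusHandTracking::IsPointerPoseValid(EOculusHandType::HandRight, 0))
	{
		const FTransform Pointer =
			UOculusHandTracking::GetPointerPose(EOculusHandType::HandRight, 0);
		// Aim the laser cursor-pointer along Pointer's forward vector...
	}

	// Scale interaction affordances to the user's estimated hand size,
	// and fade the hand mesh when tracking confidence drops.
	const float HandScale = UOculusHandTracking::GetHandScale(EOculusHandType::HandRight, 0);
	const ETrackingConfidence Confidence =
		UOculusHandTracking::GetTrackingConfidence(EOculusHandType::HandRight, 0);
}
```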
The Hand Component
The hand component (OculusHandComponent) is the OculusInput module's component for hands. This component is a subclass of Unreal's UPoseableMeshComponent, and must be a child of a UMotionControllerComponent, which provides the tracking pose and late-update functionality for hands.
The component handles loading the mesh/skeleton as well as updating the bones. It also handles setting new materials for the hand and hiding hands when tracking is lost or confidence is low. There are also options to update the root pose, update the root scale, set the pointer pose root, and enable physics capsules.
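As a sketch, wiring the component up in C++ might look like the following; AHandsActor is a hypothetical actor, and the SkeletonType property and EOculusHandType enum names are assumptions based on the component described above:

```cpp
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "MotionControllerComponent.h"
#include "OculusHandComponent.h" // OculusInput module; header path is an assumption

AHandsActor::AHandsActor()
{
	RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("Root"));

	// The motion controller supplies the tracking pose and late update...
	UMotionControllerComponent* LeftController =
		CreateDefaultSubobject<UMotionControllerComponent>(TEXT("LeftController"));
	LeftController->SetupAttachment(RootComponent);
	LeftController->MotionSource = FName(TEXT("Left"));

	// ...and the hand component must be attached as its child.
	UOculusHandComponent* LeftHand =
		CreateDefaultSubobject<UOculusHandComponent>(TEXT("LeftHand"));
	LeftHand->SetupAttachment(LeftController);
	LeftHand->SkeletonType = EOculusHandType::HandLeft; // assumption: property name
}
```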
System Gesture
If the user performs the system gesture to return to Home or access the menu, the gesture is surfaced through the OVRPlugin as an ovrpButton_Start signal and a status flag, similar to the user pressing the Home key or menu button on a controller. You will not need special menu logic for hands in this case.
(Image: the pinch gesture and the system gesture.)
Dominant hand features are surfaced through the Hand Status flags of the OVRPlugin. You can access this information by using the GetDominantHand Blueprint function.
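For example, a sketch of choosing which hand drives the cursor, again assuming the UOculusHandTracking library from the table above:

```cpp
#include "CoreMinimal.h"
#include "OculusHandTracking.h" // header path is an assumption

EOculusHandType GetPointerHand()
{
	// Prefer the user's dominant hand for the far-field cursor; the
	// controller-index parameter and the None enum value are assumptions.
	const EOculusHandType Dominant = UOculusHandTracking::GetDominantHand(0);
	return (Dominant != EOculusHandType::None) ? Dominant : EOculusHandType::HandRight;
}
```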