Hand Tracking in Unity

Data Usage Disclaimer: Enabling support for Hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.

The hand tracking feature enables the use of hands as an input method for the Oculus Quest device. It delivers a new sense of presence, enhances social engagement, and enables more natural interactions with fully tracked hands and articulated fingers. Integrated hands can perform object interactions by using simple hand gestures such as pinch, unpinch, and pinch and hold.

The hand tracking feature lets you operate with hands and controllers interchangeably. When you opt to use hands, the hand’s pose drives a laser cursor-pointer that behaves like the standard controller cursor. You can use the cursor-pointer to highlight, select, click, or write your own app-level event logic.

Hand tracking complements the Touch controllers and is not intended to replace controllers in all scenarios, especially in games or creative tools that require a high degree of precision. By opting in to hand support, your app must also satisfy additional technical requirements specific to hand tracking in order to be accepted on the Oculus Store. To submit an app to the Oculus Store, the app must support controllers along with hand tracking.

Check out the Hand Tracking Design resources that detail guidelines for using hands in virtual reality, along with the OC6 Video Presentation: Hand Tracking Deep Dive: Technology, Design, and Experiences.

Render Hands in App

Apps render hands in the same manner as any other input device. The following sections help you get started with rendering hands in your app:

Note: We support the use of hand tracking on PC through the Unity editor, when using Oculus Quest + Oculus Link. This functionality is only supported in the Unity editor to help improve iteration time for Oculus Quest developers.

Enable hand tracking and understand Android Manifest permission

OVRManager surfaces options to enable hand tracking from Unity: controllers only, controllers and hands, or hands only. For any option that enables hand tracking, Oculus automatically adds <uses-permission> and <uses-feature> elements to the AndroidManifest.xml file. If the app supports controllers only, no elements are added to the manifest file.

  • The <uses-permission> element contains the android:name="com.oculus.permission.HAND_TRACKING" attribute.

    <uses-permission android:name="com.oculus.permission.HAND_TRACKING" />

  • The <uses-feature> element contains two attributes: android:name="oculus.software.handtracking" and android:required with a boolean value. When the app supports controllers and hands, android:required is set to "false", which means that the app prefers to use hands if present, but continues to function with controllers in the absence of hands. When the app supports hands only, android:required is set to "true".

    <uses-feature android:name="oculus.software.handtracking" android:required="false" />

Tip: There are no manual updates required in the Android Manifest file when you enable hand tracking from Unity.

Set up hand tracking from Unity

  1. Create a new scene or open an existing one from your project.
  2. From the Oculus/VR/Prefabs folder, drag the OVRCameraRig prefab into the scene. Skip this step if OVRCameraRig already exists in the scene.
  3. From the Hierarchy view, select OVRCameraRig.
  4. From the Inspector view, in OVR Manager, go to the Quest section, and from the Hand Tracking Support list, select Controllers and Hands.
  5. For Unity version 2017.4.x, go to Oculus > Tools > Create store-compatible AndroidManifest.xml. This extra step generates the AndroidManifest.xml file with the currently configured hand tracking permissions. Repeat this step to generate a new AndroidManifest.xml if you change the setting.

Note: The Hands Only option is available for developer experimentation only. To submit an app to Oculus Store, the app must support controllers along with hand tracking.

Enable hand tracking on the device

Users must enable the hand tracking feature on their Oculus Quest to use their hands in the virtual environment.

  1. On Oculus Quest, go to Settings > Device.
  2. Enable Hand Tracking by sliding the toggle button.
  3. Enable Auto Enable Hands or Controllers by sliding the toggle button, if you want to automatically switch between hands and controllers.

Add Hands to a Scene

The OVRHandPrefab prefab implements hands as an input device.

  1. From the Hierarchy view, expand OVRCameraRig > TrackingSpace.
  2. From the Oculus/VR/Prefabs folder, drag the OVRHandPrefab prefab under LeftHandAnchor, which is under TrackingSpace.
  3. Repeat step 2 to drag the OVRHandPrefab prefab under RightHandAnchor.
  4. Under LeftHandAnchor, select OVRHandPrefab to open settings in the Inspector view.
  5. Under OVRHand, OVRSkeleton, and OVRMesh, select the left hand type.
  6. Repeat steps 4 and 5 to set RightHandAnchor to right hand type.

Once OVRHandPrefab is added to each hand anchor and configured with the appropriate handedness setting, you can start using hands as input devices.
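As a rough illustration of consuming the configured prefabs at runtime, the following sketch polls the OVRHand component on each hand every frame. The leftHand and rightHand fields and the class name are assumptions for this example, not part of the SDK; assign the OVRHandPrefab instances under the hand anchors in the Inspector.

```csharp
using UnityEngine;

// Sketch: poll the OVRHand components on the two OVRHandPrefab instances.
// Assign leftHand/rightHand in the Inspector (hypothetical field names).
public class HandStatusLogger : MonoBehaviour
{
    public OVRHand leftHand;
    public OVRHand rightHand;

    void Update()
    {
        // IsTracked is true while the device can currently see and track the hand.
        if (leftHand != null && leftHand.IsTracked)
            Debug.Log("Left hand is tracked");
        if (rightHand != null && rightHand.IsTracked)
            Debug.Log("Right hand is tracked");
    }
}
```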

Configure OVRHandPrefab

There are several advanced settings that influence how hands render and interact with objects in your app.

The following sections walk through these settings:

Get Skeleton and Mesh Data

OVR Mesh Renderer renders hands by combining data returned by OVR Skeleton and OVR Mesh.

  • Select the OVRHandPrefab prefab to configure the following settings:
    • OVR Skeleton exposes data such as the skeleton bind pose, bone hierarchy, and capsule collider data.

      In the Skeleton Type list, select the hand for which you are retrieving the data. For example, Hand Left.

    • OVR Mesh handles loading a specified 3D asset from the Oculus runtime and exposing it as a UnityEngine.Mesh. The mesh is configured with attributes such as vertices, uvs, normals, and bone weights.

      In the Mesh Type list, select the hand for which you are retrieving the data. For example, Hand Left.

    • OVR Mesh Renderer combines the data returned by OVR Skeleton and OVR Mesh to generate the animated 3D model of hands.

      Ensure the OVR Mesh Renderer checkbox is selected.

Update Root Pose

When the OVRHandPrefab is parented to the left or right hand anchors in the OVRCameraRig, leave the Update Root Pose checkbox unchecked so that the hand anchors can correctly position the hands in the tracking space.

If you choose to use OVRHandPrefab independently of OVRCameraRig, select the Update Root Pose checkbox to ensure that not only the fingers and bones, but the actual root of the hand is correctly updated.

Enable Hand Model Root Scale

You can estimate the user’s hand size via a uniform scale applied to the reference hand model. By default, the reference hand model is scaled to 100% (1.0). When scaling is enabled, the hand model is scaled up or down based on the user’s actual hand size. The hand scale may change at any time, so we recommend scaling the hand for rendering and interaction at runtime.

  1. In the Hierarchy window, select the OVRHandPrefab prefab.
  2. In the Inspector window, under OVR Skeleton (Script), select the Update Root Scale checkbox.

Note: To use the default reference hand size, leave the checkbox unchecked.

Add Physics Capsules

Add physics capsules that represent the volume of the bones in the hand. Use this feature to trigger interactions with physical objects and generate collision events with other rigid bodies in the physics system.

  1. In the Hierarchy window, select the OVRHandPrefab prefab.
  2. In the Inspector window, under OVR Skeleton (Script), select the Enable Physics Capsules checkbox.
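As a minimal sketch of consuming these collision events, the following script logs contacts on a scene object. The exact naming of the generated capsule colliders is an implementation detail, so the script simply reports whichever collider made contact; attach it to an object that has a Collider and a Rigidbody.

```csharp
using UnityEngine;

// Sketch: detect when a hand's physics capsule touches this object.
// Attach to a scene object with a Collider and a Rigidbody.
public class HandTouchDetector : MonoBehaviour
{
    void OnCollisionEnter(Collision collision)
    {
        // Log the name of the colliding capsule to inspect contacts during development.
        Debug.Log($"Touched by {collision.collider.name}");
    }
}
```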

Enable Debug Wireframe Skeleton Rendering

You can render the bones of the hand with wireframe lines to assist with visual debugging. Use this feature to debug the hand visually during development.

  1. In the Hierarchy window, select the OVRHandPrefab prefab.
  2. In the Inspector window, select the OVR Skeleton Renderer checkbox.

Customize Skin Material

To change the visual appearance of the default hand model, provide a customized skin material. For more information on how to create a new material, go to Creating and using materials in the Unity developer guide.

  1. In the Hierarchy window, select the OVRHandPrefab prefab.
  2. In the Inspector window, ensure the Skinned Mesh Renderer checkbox is selected.
  3. Under Materials, in Element 0, drag the material you want to use.

Use customized mesh

Use your own customized mesh to render hands of your choice. This is done by mapping your custom skeleton so that it is driven by the tracked hand skeleton. For sample usage, refer to the HandTest_Custom scene, which uses the new OVRCustomHandPrefab_L and OVRCustomHandPrefab_R prefabs, as well as the new OVRCustomSkeleton.cs script.

Integrate Hands in App

Once you’ve configured OVRHandPrefab settings, hands start rendering in your app. This section walks you through several scripts and APIs that help you build an immersive and interactive app by integrating hands into your app:

Understand Bone IDs

OVRSkeleton.cs contains a full list of bone IDs, which uniquely identify each bone of the skeleton. Use bone IDs to devise app-level interaction, such as detecting gestures, calculating gesture confidence, targeting a particular bone, or triggering collision events in the physics system.

Invalid          = -1
Hand_Start       = 0
Hand_WristRoot   = Hand_Start + 0 // root frame of the hand, where the wrist is located
Hand_ForearmStub = Hand_Start + 1 // frame for user's forearm
Hand_Thumb0      = Hand_Start + 2 // thumb trapezium bone
Hand_Thumb1      = Hand_Start + 3 // thumb metacarpal bone
Hand_Thumb2      = Hand_Start + 4 // thumb proximal phalange bone
Hand_Thumb3      = Hand_Start + 5 // thumb distal phalange bone
Hand_Index1      = Hand_Start + 6 // index proximal phalange bone
Hand_Index2      = Hand_Start + 7 // index intermediate phalange bone
Hand_Index3      = Hand_Start + 8 // index distal phalange bone
Hand_Middle1     = Hand_Start + 9 // middle proximal phalange bone
Hand_Middle2     = Hand_Start + 10 // middle intermediate phalange bone
Hand_Middle3     = Hand_Start + 11 // middle distal phalange bone
Hand_Ring1       = Hand_Start + 12 // ring proximal phalange bone
Hand_Ring2       = Hand_Start + 13 // ring intermediate phalange bone
Hand_Ring3       = Hand_Start + 14 // ring distal phalange bone
Hand_Pinky0      = Hand_Start + 15 // pinky metacarpal bone
Hand_Pinky1      = Hand_Start + 16 // pinky proximal phalange bone
Hand_Pinky2      = Hand_Start + 17 // pinky intermediate phalange bone
Hand_Pinky3      = Hand_Start + 18 // pinky distal phalange bone
Hand_MaxSkinnable= Hand_Start + 19
// Bone tips are position only. They are not used for skinning but are useful for hit-testing.
// NOTE: Hand_ThumbTip == Hand_MaxSkinnable since the extended tips need to be contiguous
Hand_ThumbTip    = Hand_Start + Hand_MaxSkinnable + 0 // tip of the thumb
Hand_IndexTip    = Hand_Start + Hand_MaxSkinnable + 1 // tip of the index finger
Hand_MiddleTip   = Hand_Start + Hand_MaxSkinnable + 2 // tip of the middle finger
Hand_RingTip     = Hand_Start + Hand_MaxSkinnable + 3 // tip of the ring finger
Hand_PinkyTip    = Hand_Start + Hand_MaxSkinnable + 4 // tip of the pinky
Hand_End         = Hand_Start + Hand_MaxSkinnable + 5
Max              = Hand_End + 0

Retrieve the Current Bone IDs

OVRSkeleton.cs provides methods that retrieve the current skeleton’s start bone ID, the end bone ID, and the number of total bones.

  • Call the GetCurrentStartBoneId() and GetCurrentEndBoneId() methods to retrieve the start and end bone IDs, which are mainly used to iterate over the subset of bone IDs present in the currently configured skeleton type.
  • Call the GetCurrentNumBones() and GetCurrentNumSkinnableBones() methods to return the total number of bones in the skeleton and the total number of bones that are skinnable. The difference is that Bones also includes anchors for the fingertips; these tips are not part of the hand skeleton in terms of mesh or animation, so the skinnable bones have them filtered out.
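The methods above can be sketched as follows. This assumes the script is attached to an OVRHandPrefab instance alongside OVRSkeleton, and it waits for the skeleton to finish initializing before reading bone data; the class name is hypothetical.

```csharp
using UnityEngine;

// Sketch: once the skeleton is initialized, log bone counts and
// each bone's ID and current position.
public class BoneLister : MonoBehaviour
{
    OVRSkeleton _skeleton;
    bool _logged;

    void Start()
    {
        _skeleton = GetComponent<OVRSkeleton>();
    }

    void Update()
    {
        // Skeleton data is loaded asynchronously from the runtime.
        if (_logged || _skeleton == null || !_skeleton.IsInitialized)
            return;

        Debug.Log($"Bones: {_skeleton.GetCurrentNumBones()}, " +
                  $"skinnable: {_skeleton.GetCurrentNumSkinnableBones()}");

        // The Bones list pairs each bone ID with the transform that drives it.
        foreach (var bone in _skeleton.Bones)
            Debug.Log($"{bone.Id}: {bone.Transform.position}");

        _logged = true;
    }
}
```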

Add Interactions

In order to standardize interactions across various apps, the hand tracking API exposes a filtered pointer pose and detection for pinch gestures. Simple apps that only require point and click interactions can use the pointer pose to treat hands as a simple pointing device, with the pinch gesture acting as the click action.

OVRHand.cs provides access to the pointer pose and pinches to ensure your app conforms to the same interaction models as Oculus system applications. The following sections walk you through several functions that you can perform on hands:

Track Hands and Confidence Level

At any point, your app logic may need to check whether hands are detected. OVRHand.cs provides the IsTracked property to verify that hands are currently visible and not occluded from being tracked by the device. It also provides the HandConfidence property, which indicates the tracking system’s confidence in the overall hand pose, returned as a HandConfidence value of either Low or High.

You can use HandConfidence values in your app logic to control hand rendering. For example, forego rendering hands when the confidence level is low. We recommend using hand pose data for rendering and interactions only when IsTracked is true and HandConfidence is High.
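A minimal sketch of this recommendation, assuming the script sits on an OVRHandPrefab instance that has both an OVRHand and a SkinnedMeshRenderer component (the class name is hypothetical):

```csharp
using UnityEngine;

// Sketch: hide the hand mesh unless tracking is reliable.
public class HandVisibilityGate : MonoBehaviour
{
    OVRHand _hand;
    SkinnedMeshRenderer _renderer;

    void Start()
    {
        _hand = GetComponent<OVRHand>();
        _renderer = GetComponent<SkinnedMeshRenderer>();
    }

    void Update()
    {
        // Only render when the hand is visible and confidence is High.
        bool reliable = _hand.IsTracked &&
            _hand.HandConfidence == OVRHand.TrackingConfidence.High;
        _renderer.enabled = reliable;
    }
}
```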

Retrieve Hand Scale

Retrieve the scale of the user’s hand, which is relative to the default hand model scale of 1.0.

  • Call the HandScale property, which returns a floating point value that indicates the current user’s hand scale compared to the reference hand size.

For example, a value of 1.05 indicates the user’s hand size is 5% larger than the reference hand. The value may change at any time, and apps should use it to scale the hand for rendering and interaction simulation at runtime.
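For example, a sketch that applies HandScale each frame to a transform sized for the reference hand. The hand and target fields are assumptions for this example, to be assigned in the Inspector:

```csharp
using UnityEngine;

// Sketch: keep an interaction volume in sync with the tracked hand scale,
// since HandScale can change at any time.
public class HandScaleApplier : MonoBehaviour
{
    public OVRHand hand;      // an OVRHandPrefab instance (assigned in Inspector)
    public Transform target;  // e.g. a grab volume sized for the reference hand

    void Update()
    {
        if (hand != null && hand.IsTracked)
            target.localScale = Vector3.one * hand.HandScale;
    }
}
```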

Integrate Pinch

Pinch is the basic interaction primitive for UI interactions using hands. A successful pinch of the index finger can be considered the same as a normal select or trigger action for a controller, i.e., the action that activates a button or other control on a UI.

To detect whether the finger is currently pinching and to check the pinch’s strength, call the GetFingerIsPinching() and GetFingerPinchStrength() methods from OVRHand.cs. Pass the relevant finger constant defined in the HandFinger enum for the finger that you want to query. The finger constants are: Thumb, Index, Middle, Ring, and Pinky.

Example usage

var hand = GetComponent<OVRHand>();
bool isIndexFingerPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
float ringFingerPinchStrength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Ring);

The progression of a pinch gesture is indicated by the returned float value. For each finger pinch, the corresponding value ranges from 0 to 1, where 0 indicates no pinch and 1 indicates full pinch with the finger touching the thumb.

In addition to the pinch strength, OVRHand.cs also provides the GetFingerConfidence() method to measure the confidence level of the finger pose. It’s measured in terms of low or high, which indicates the amount of confidence that the tracking system has for the finger pose.

  • To retrieve the confidence level of a finger pose, call the GetFingerConfidence() method and pass the finger constant for which you want to track the confidence level.

Example usage

var hand = GetComponent<OVRHand>();
OVRHand.TrackingConfidence confidence = hand.GetFingerConfidence(OVRHand.HandFinger.Index);

Integrate Pointer Pose

Deriving a stable pointing direction from a tracked hand is a non-trivial task involving filtering, gesture detection, and other factors. OVRHand.cs provides a pointer pose so that pointing interactions can be consistent across apps. It indicates the starting point and direction of the pointing ray in the tracking space. We recommend that you use PointerPose to determine the direction the user is pointing for UI interactions.

  • Call the PointerPose property from OVRHand.cs.

The pointer pose may or may not be valid, depending on the user’s hand position, tracking status, and other factors. Call the IsPointerPoseValid property, which returns a boolean indicating whether the pointer pose is valid. If the pointer pose is valid, you can use the ray for UI hit testing. Otherwise, you should avoid using it for rendering the ray.
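Putting the two properties together, the following sketch casts a UI ray from the pointer pose only while it is valid. The hand and ray fields and the class name are assumptions for this example; the LineRenderer is optional and only visualizes the ray.

```csharp
using UnityEngine;

// Sketch: hit-test along the filtered pointer pose when it is valid,
// and hide the ray visual otherwise.
public class HandPointer : MonoBehaviour
{
    public OVRHand hand;      // an OVRHandPrefab instance (assigned in Inspector)
    public LineRenderer ray;  // optional visual for the pointing ray

    void Update()
    {
        if (hand == null || !hand.IsPointerPoseValid)
        {
            if (ray != null) ray.enabled = false;
            return;
        }

        Transform pose = hand.PointerPose;
        if (ray != null)
        {
            ray.enabled = true;
            ray.SetPosition(0, pose.position);
            ray.SetPosition(1, pose.position + pose.forward * 2f);
        }

        // Hit-test scene colliders along the pointing direction.
        if (Physics.Raycast(pose.position, pose.forward, out RaycastHit hit, 10f))
            Debug.Log($"Pointing at {hit.collider.name}");
    }
}
```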

Check System Gestures

The system gesture is a reserved gesture that allows users to transition to the Oculus universal menu. This behavior occurs when users hold their dominant hand up with an open palm towards the headset and then pinch with their index finger. When the user performs the gesture with the non-dominant hand, it triggers the Button.Start event. You can poll Button.Start to integrate any action for the button press event in your app logic.

To detect the dominant hand, call the IsDominantHand property from OVRHand.cs. To check whether the user is performing a system gesture, call the IsSystemGestureInProgress property. If IsSystemGestureInProgress returns true, we recommend that the app provide visual feedback to the user, such as rendering the hand material with a different color or a highlight, to indicate that a system gesture is in progress. The app should also suspend any custom gesture processing while the user performs a system gesture. This prevents the app from triggering a gesture-based event when the user intends to transition to the Oculus universal menu.
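These recommendations can be sketched as follows. The hand, handRenderer, and highlight fields and the class name are assumptions for this example; the script tints the hand during a system gesture and polls Button.Start.

```csharp
using UnityEngine;

// Sketch: give visual feedback during a system gesture and poll
// Button.Start, which the non-dominant hand's gesture raises.
public class SystemGestureWatcher : MonoBehaviour
{
    public OVRHand hand;           // an OVRHandPrefab instance (assigned in Inspector)
    public Renderer handRenderer;  // hand mesh renderer to tint
    public Color highlight = Color.cyan;

    Color _original;

    void Start()
    {
        _original = handRenderer.material.color;
    }

    void Update()
    {
        // Tint the hand while the gesture is in progress; custom gesture
        // processing should also be suspended during this time.
        handRenderer.material.color =
            hand.IsSystemGestureInProgress ? highlight : _original;

        if (OVRInput.GetDown(OVRInput.Button.Start))
            Debug.Log("Start pressed via non-dominant hand system gesture");
    }
}
```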


Troubleshooting

The following questions help you troubleshoot issues you may encounter while rendering and integrating hands in your app:

  • Why don’t I see hands in my app?

    There can be many reasons why hands are not rendering in your app. To begin with, verify that hand tracking is enabled on the device and that hands are working correctly in the system menus. Check the permissions set in your app’s AndroidManifest.xml file. Ensure that you have used the OVRHandPrefab prefab to add hands in the scene.

  • Why do I see blurry/faded hands?

    Your hands may not be properly tracked since the cameras on the Oculus Quest have a limited field of view. Make sure the hands are closer to the front of the Oculus Quest for better tracking.

  • Can I use another finger besides the index finger for the pinch gesture?

    Yes. Use the GetFingerIsPinching() method from OVRHand.cs with the finger that you want to track instead. For more information about tracking fingers, go to Integrate Pinch.

Understanding Hand Tracking Limitations

Hand tracking for Oculus Quest is currently an experimental feature with some limitations. While these limitations may be reduced or even eliminated over time, they are currently part of the expected behavior. For more specific issues, go to the Troubleshooting section.


Occlusion

Tracking may be lost, or hand confidence may become low, when one hand occludes the other. In general, an app should respond to this by fading the hands away.


Tracking Noise

Hand tracking can exhibit some noise and may be affected by lighting and environmental conditions. Take these conditions into consideration when developing algorithms for gesture detection.

Controllers + Hands

Controllers and hands are not currently tracked at the same time. Apps should support either hands or controllers, but not both at the same time.


Lighting

Hand tracking has different lighting requirements than inside-out head tracking. In some situations, this can result in functional differences between head tracking and hand tracking, where one may work while the other has stopped functioning.