All Oculus Quest developers MUST PASS the concept review prior to gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.
Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.
The hand tracking feature enables the use of hands as an input method on Oculus Quest devices. It provides a new sense of presence, enhances social engagement, and can help deliver more natural interactions.
Using simple hand gestures, such as pinch, poke, and pinch-and-hold, you can integrate hand tracking into your apps so users can select, click, scroll, drag and drop, return, or exit.
Hand tracking complements Touch controllers, but is not intended to replace controllers in all scenarios, particularly with games or creative tools that require a high degree of precision.
For an app to use hands as input devices, it must include the following feature flag and permission in its Android manifest:
<uses-permission android:name="oculus.permission.HAND_TRACKING" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
Apps that do not specify these flags will not see hand devices enumerated. Users must also have hand tracking enabled on their devices for hand tracking to work in-app.
If an app requires hands and does not work with other input devices, it must set android:required="true" in the uses-feature declaration in its manifest. If an app works with either hands or controllers, the manifest should set android:required="false" in the uses-feature declaration, as shown in the example above.
The native Mobile SDK includes a VrHands sample app that demonstrates the use of the hand tracking API.
The structure, pose, and physical appearance of the hand are queried through three new VrApi functions.
vrapi_GetHandSkeleton ( ovrMobile * ovr, const ovrHandedness handedness, ovrHandSkeletonHeader * header )
vrapi_GetHandSkeleton returns the rest pose, bone hierarchy, and capsule collision data for the specified hand. Each bone in the hand may have 0 or more collision capsules attached to it. Apps can use these capsules to perform collision tests using their own collision system.
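For example, an app's own collision system might test a point (such as a fingertip or a cursor position) against a bone capsule using a segment-distance check. The following is a self-contained sketch with stand-in types; the actual capsule layout comes from the skeleton data returned by vrapi_GetHandSkeleton and is defined in the SDK headers:

```c
/* Stand-in types: the real capsule data comes from the bone capsule
 * entries in the hand skeleton (see the SDK headers for exact layout). */
typedef struct { float x, y, z; } Vec3;
typedef struct { Vec3 a, b; float radius; } BoneCapsule;

static float vec3_dot(Vec3 u, Vec3 v) { return u.x*v.x + u.y*v.y + u.z*v.z; }
static Vec3  vec3_sub(Vec3 u, Vec3 v) { Vec3 r = { u.x-v.x, u.y-v.y, u.z-v.z }; return r; }

/* Returns 1 if point p is inside the capsule: the squared distance from p
 * to the segment a-b is compared against the squared capsule radius. */
int point_in_capsule(const BoneCapsule *c, Vec3 p)
{
    Vec3 ab = vec3_sub(c->b, c->a);
    Vec3 ap = vec3_sub(p, c->a);
    float t = vec3_dot(ap, ab) / vec3_dot(ab, ab);
    if (t < 0.0f) t = 0.0f;            /* clamp to the segment endpoints */
    if (t > 1.0f) t = 1.0f;
    Vec3 closest = { c->a.x + t*ab.x, c->a.y + t*ab.y, c->a.z + t*ab.z };
    Vec3 d = vec3_sub(p, closest);
    return vec3_dot(d, d) <= c->radius * c->radius;
}
```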
vrapi_GetHandPose ( ovrMobile * ovr, const ovrDeviceID deviceID, const double absTimeInSeconds, ovrHandPoseHeader * header )
vrapi_GetHandPose returns the current pose state of the hand, along with additional state such as confidence and hand scale. Upon return, the ovrHandPose structure passed to this function contains the current rotation of each of the hand bones. Apply these rotations to the corresponding bones in the hand skeleton returned by vrapi_GetHandSkeleton to calculate the pose of the user’s hand at the specified time.
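Applying the per-bone rotations amounts to composing each bone's local rotation with its parent's world rotation down the hierarchy. Below is a minimal sketch with simplified stand-in types (the real API uses ovrQuatf/ovrPosef, and the parent indices come from the hand skeleton); it assumes parents appear before children in the bone array, with -1 marking the root:

```c
#include <math.h>

/* Stand-in quaternion type; the real pose data uses the SDK's math types. */
typedef struct { float x, y, z, w; } Quat;

/* Hamilton product: the result applies rotation b first, then a. */
Quat quat_mul(Quat a, Quat b)
{
    Quat r;
    r.w = a.w*b.w - a.x*b.x - a.y*b.y - a.z*b.z;
    r.x = a.w*b.x + a.x*b.w + a.y*b.z - a.z*b.y;
    r.y = a.w*b.y - a.x*b.z + a.y*b.w + a.z*b.x;
    r.z = a.w*b.z + a.x*b.y - a.y*b.x + a.z*b.w;
    return r;
}

/* For each bone, the world rotation is the parent's world rotation
 * composed with the bone's local rotation from the hand pose. Assumes
 * parentIndices[i] < i (parents precede children) and -1 for the root. */
void compute_world_rotations(const Quat *localRot, const int *parentIndices,
                             int boneCount, Quat *worldRot)
{
    for (int i = 0; i < boneCount; i++) {
        int p = parentIndices[i];
        worldRot[i] = (p < 0) ? localRot[i] : quat_mul(worldRot[p], localRot[i]);
    }
}
```

The world rotations can then be combined with the rest-pose bone translations to produce final skinning transforms.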
The TrackingStatus field indicates whether the hand is currently tracked or untracked.
The HandConfidence field is an enumeration that indicates how much confidence the tracking system has that the entire hand pose is correct. The confidence value is either ovrConfidence_LOW or ovrConfidence_HIGH. Apps can use this value to forgo rendering hands if the confidence becomes low.
The HandScale field is the scale of the user’s hand, relative to the default hand model scale of 1.0. HandScale may change at any time, and apps should use the value to scale the hand for rendering and interaction simulation at runtime.
The FingerConfidences field is an array of ovrConfidence values, one per finger, indicating the tracking system’s confidence that the pose for that finger is correct.
vrapi_GetHandMesh ( ovrMobile * ovr, const ovrHandedness handedness, ovrHandMeshHeader * header )
vrapi_GetHandMesh returns the mesh for the specified hand. The mesh is returned as arrays of vertex attributes and vertex indices. These attributes include vertex blend weights.
Developers will normally want to read this data and convert it to a vertex buffer format of their own choosing. The mesh UVs are compatible with the Oculus Avatars SDK hand mesh.
A PNG image showing the UV mapping is included in the VrHands sample as /VrHands/assets/HandsTextureMapping.png. A full FBX for each hand (left and right) is included in the /VrHands/fbx folder. These FBX files are intended as a starting point for making custom hand meshes or skeletons.
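Converting the returned attribute arrays into a vertex buffer format of your own choosing typically means interleaving them. The following is a simplified sketch using stand-in arrays; the actual mesh also carries normals, blend indices, and blend weights, which would be interleaved the same way:

```c
#include <stdlib.h>

/* Illustrative interleaved layout; a real app would match its renderer's
 * vertex format and include the skinning attributes as well. */
typedef struct { float px, py, pz, u, v; } Vertex;

/* Interleave separate position (3 floats/vertex) and UV (2 floats/vertex)
 * arrays into a single contiguous vertex buffer. Caller frees the result. */
Vertex *interleave_mesh(const float *positions, const float *uvs, int vertexCount)
{
    Vertex *out = (Vertex *)malloc(sizeof(Vertex) * (size_t)vertexCount);
    if (!out) return NULL;
    for (int i = 0; i < vertexCount; i++) {
        out[i].px = positions[3*i + 0];
        out[i].py = positions[3*i + 1];
        out[i].pz = positions[3*i + 2];
        out[i].u  = uvs[2*i + 0];
        out[i].v  = uvs[2*i + 1];
    }
    return out;
}
```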
All of the hand functions take pointers to structures that require a version to be set in the structure’s Header.Version field before the structure is passed to the API functions. The Version field must be set to one of the enum values of type ovrHandVersion. If the Version field does not match one of those values, the API functions will return ovrError_InvalidParameter and the structure will be unmodified.
It is recommended that developers update to the latest version of the hand assets when updating to new versions of the native SDK. However, the structure versioning allows updating to newer versions of the SDK without necessarily changing an app’s hand model assets, or dealing with bone retargeting, which can be an involved process.
To use the API’s latest model assets, set the Version field of the API structures to the highest version in the ovrHandVersion enum. To continue using a previous version, simply do not update the Version field.
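The versioning contract described above can be sketched with stand-in types. HandVersion_1 and the error codes below are illustrative placeholders for the real ovrHandVersion values and ovrError_InvalidParameter:

```c
/* Illustrative stand-ins for the SDK's version enum and error codes. */
enum { HandVersion_1 = 1, HandVersion_Latest = HandVersion_1 };
enum { Success = 0, Error_InvalidParameter = -1 };

typedef struct { int Version; } Header;
typedef struct { Header Header; int BoneCount; } HandSkeleton;

/* Mimics the API contract: the caller must set Header.Version before the
 * call; an unrecognized version is rejected and the struct is untouched. */
int get_hand_skeleton(HandSkeleton *skel)
{
    if (skel->Header.Version != HandVersion_1)
        return Error_InvalidParameter;  /* structure left unmodified */
    skel->BoneCount = 24;               /* placeholder payload */
    return Success;
}
```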
Note that older versions of meshes may be deprecated in the future, in which case an app rebuilt with a newer SDK and an older ovrHandVersion value may still receive ovrError_InvalidParameter when calling the hand API functions.
Also note that the mesh structure can be large (on the order of 256 KB), and allocating multiple structures simultaneously on the stack (for instance, one for the left hand and one for the right hand) may risk overflowing the app stack. Apps can address this by querying and processing only one hand at a time using the same structure, by increasing the app stack size, or by dynamically allocating the structure passed to the API on the heap.
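The heap-allocation option can be sketched as follows; LargeHandMesh is an illustrative stand-in for the real mesh structure:

```c
#include <stdlib.h>

/* Stand-in for a large API structure (~256 KB, as noted above). Placing it
 * on the heap avoids risking stack overflow when both hands are queried. */
typedef struct { char bytes[256 * 1024]; } LargeHandMesh;

LargeHandMesh *alloc_hand_mesh(void)
{
    /* calloc zero-initializes, so no stale data reaches the API call. */
    return (LargeHandMesh *)calloc(1, sizeof(LargeHandMesh));
}
```

The same allocation can then be reused for the second hand after the first hand's data has been processed, or freed once the app has converted the mesh into its own buffers.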
Hand joints are described by the ovrHandBone enumeration, which defines an invalid joint, all skinned joints, and a tip joint for each finger. While the tip joints are not skinned (that is, not bound to any mesh vertices), they provide useful location information for the tips of the fingers.
The following image shows joints in the hand skeleton grouped by prefix.
Basic hand state, the state derived from the hand’s pose, is queried using the vrapi_GetCurrentInputState method, like other input devices. If the queried device is a hand, this function must be passed a pointer to an ovrInputStateHand structure. This structure includes the current pointer pose for the hand, the relative strength of any supported gestures such as pinching, and flags indicating the status of the pointer and some gestures.
In order to normalize interactions across various apps, the hand tracking API exposes a filtered pointer pose and detection for pinch gestures. In future SDKs the set of supported gestures may be expanded.
Simple apps that only require point-and-click interactions can use the pointer pose to treat hands as a simple pointing device, with the pinch gesture acting as the click action. Using the PointerPose and pinch status from ovrInputStateHand is a good way to ensure your app conforms to the same interaction models as Oculus system apps.
Note that the hand pointer pose can also be queried by passing a device ID representing one of the tracked hands to the standard device tracking functions.
Pinches are the basic interaction primitive for UI interactions using hands. A successful pinch of the index finger can be considered the same as a normal click, select, or trigger action for a controller. It is an action that activates a button or other control on a UI.
The ovrInputStateHand structure exposes both a “strength” and a Boolean status for pinches. If a pinch gesture is detected, the corresponding ovrInputStateHandStatus_<finger name>Pinching bit for the pinching finger is set in the InputStateStatus field.
The progression of a pinch gesture is indicated by the PinchStrength field. For each finger, the corresponding array entry ranges from 0 (not pinching) to 1 (fully pinching, with the finger touching the thumb). While PinchStrength is useful in some cases, such as sizing the pinch ray indicator as a pinch gesture progresses, the actual triggering of a pinch event should be based on the ovrInputStateHandStatus_<finger name>Pinching bit being set in the InputStateStatus field.
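This split, triggering on the Pinching status bit while using PinchStrength only for visuals, can be sketched with stand-in types (the flag names and field layout below are simplified placeholders for the real definitions in the SDK headers):

```c
/* Stand-in flags and state mirroring the ovrInputStateHand fields
 * described above; the real bit values live in the SDK headers. */
enum {
    HandStatus_IndexPinching  = 1u << 0,
    HandStatus_MiddlePinching = 1u << 1,
};

typedef struct {
    unsigned int InputStateStatus;
    float PinchStrength[4];   /* one entry per finger, 0..1 */
} InputStateHand;

/* Trigger the click action from the Pinching status bit, as recommended. */
int index_pinch_clicked(const InputStateHand *state)
{
    return (state->InputStateStatus & HandStatus_IndexPinching) != 0;
}

/* Use PinchStrength only for visuals, e.g. shrinking a pinch ray
 * indicator as the pinch closes. */
float index_pinch_indicator_scale(const InputStateHand *state)
{
    return 1.0f - 0.5f * state->PinchStrength[0];
}
```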
Deriving a stable pointing direction from a tracked hand is a non-trivial task involving filtering, gesture detection, and other factors. The hand tracking API provides the PointerPose field on the ovrInputStateHand structure so that pointing interactions can be consistent across apps. The PointerPose field is an ovrPosef that indicates the origin and orientation of the pointing ray in world space. It is recommended that developers use this field to determine the direction the user is pointing for UI interactions.
The pointer pose may or may not be valid at any time, depending on the hand position, tracking status, and other factors. When the pointer pose is valid, the ovrInputStateHandStatus_PointerValid flag will be set in the ovrInputStateHand::InputStateStatus field. If this flag is set, the pointer pose is valid and the ray it describes may be used for UI hit testing. If the flag is not set, the app should avoid using or rendering the ray for UI hit testing.
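Once the PointerValid flag is set, the pointer ray can be used for UI hit testing. A minimal sketch, assuming a flat UI panel lying on the plane z = planeZ and treating the pose as an origin plus a unit direction vector (a real app would derive the direction from the pose's orientation quaternion):

```c
#include <math.h>

/* Stand-in vector type: the real PointerPose is an ovrPosef whose position
 * gives the ray origin and whose orientation gives the ray direction. */
typedef struct { float x, y, z; } V3;

/* Intersect the pointer ray with the plane z = planeZ (a UI panel facing
 * the user). Returns 1 and writes the hit point when the panel is in front
 * of the ray; returns 0 when the ray is parallel or the panel is behind. */
int ray_hit_panel(V3 origin, V3 dir, float planeZ, V3 *hit)
{
    if (fabsf(dir.z) < 1e-6f) return 0;   /* ray parallel to the panel */
    float t = (planeZ - origin.z) / dir.z;
    if (t < 0.0f) return 0;               /* panel is behind the ray */
    hit->x = origin.x + t * dir.x;
    hit->y = origin.y + t * dir.y;
    hit->z = planeZ;
    return 1;
}
```

The hit point can then be compared against the panel's bounds to decide which control, if any, the user is pointing at.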
The ovrInputStateHandStatus_DominantHand flag can be set to specify a user’s dominant hand. When set, pinch gestures retain their normal behavior on both hands, but the system gesture results in different ovrInputStateHandStatus values for the non-dominant hand.
If the runtime detects the user performing a system gesture on either hand, the InputStateStatus field of ovrInputStateHand will have the ovrInputStateHandStatus_SystemGestureProcessing bit set. If the non-dominant hand is performing the gesture, the InputStateStatus field of ovrInputStateHand will also have the ovrInputStateHandStatus_MenuPressed bit set for a single frame after the event to signal a menu button press.
When the ovrInputStateHandStatus_SystemGestureProcessing bit is set, it is recommended that apps rendering their own hand models highlight the hand to indicate that a system gesture is in progress. By checking this bit, an app can suspend its own gesture processing while the user is performing a system gesture. This allows apps to avoid triggering a gesture-based event when the user intends a system gesture.
Hand tracking for Oculus Quest is currently a preview feature with some limitations. While these limitations may be reduced or even eliminated over time, they are currently part of the expected behavior. Known limitations include the following:
Tracking may be lost or hand confidence may become low when one hand occludes another. In general, an app should respond to this by fading the hands away.
Hand tracking can exhibit some noise. This may be affected by lighting and environmental conditions. Developers should take this into consideration when developing algorithms for gesture detection.
Controllers + Hands
Controllers and hands are not currently tracked at the same time. Apps should expect either hands to be tracked or controllers to be tracked, but not both at the same time.
Hand tracking has different lighting requirements than Oculus Touch tracking. In some situations this could result in head tracking working while hand tracking does not, or vice versa.