Overview
Note: The recommended way for Unity developers to integrate hand tracking is to use the Interaction SDK, which provides standardized interactions and gestures. Building custom interactions without the SDK can be a significant challenge and can make it harder to get your app approved in the store.
Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data may only be used to enable hand tracking within your app; its use for any other purpose is expressly forbidden.
Note: There is a known issue with the thumb trapezium bone (Thumb0) in the OpenXR backend.
Hand tracking enables the use of hands as an input method for Meta Quest headsets. Using hands as an input method delivers a new sense of presence, enhances social engagement, and enables more natural interactions. Hand tracking complements controllers and is not intended to replace them in all scenarios, especially in games or creative tools that require a high degree of precision.
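As a minimal sketch of using hands as an input source, the following Unity script polls an `OVRHand` component for tracking state and an index-finger pinch. It assumes the Meta XR Core SDK is installed and that `hand` references one of the hand prefabs under the camera rig; treat it as an illustration rather than a complete integration.

```csharp
using UnityEngine;

// Sketch: reads hand tracking state and index-finger pinch from an
// OVRHand component (Meta XR Core SDK). Attach to any scene object and
// assign a hand prefab's OVRHand in the Inspector.
public class HandInputSketch : MonoBehaviour
{
    [SerializeField] private OVRHand hand; // e.g. the left or right hand prefab

    void Update()
    {
        // Only act on data the runtime considers tracked and high-confidence.
        if (hand == null || !hand.IsTracked || !hand.IsDataHighConfidence)
            return;

        // Pinch between thumb and index finger, with an analog strength in [0, 1].
        bool isPinching = hand.GetFingerIsPinching(OVRHand.HandFinger.Index);
        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);

        if (isPinching)
            Debug.Log($"Index pinch, strength {strength:F2}");
    }
}
```

For production interactions (grab, poke, ray-based UI), the Interaction SDK recommended above provides these patterns out of the box rather than requiring hand-rolled gesture detection like this.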
We support the use of hand tracking on Windows through the Unity editor when using a Meta Quest headset with Link. This functionality is supported only in the Unity editor, to help improve iteration time for Meta Quest developers. Check out the Hand Tracking Design resources, which detail guidelines for using hands in virtual reality.
Note: If you are just getting started with this Meta XR feature, we recommend that you use Building Blocks, a Unity extension for Meta XR SDKs, to quickly add features to your project.