Use Capsense
Updated: Jul 15, 2024
Capsense provides logical hand poses when using controllers. It uses tracked controller data to provide a standard set of hand animations and poses for supported Oculus controllers. For consistency, we provide the Oculus Home hand and controller visuals. Capsense supports two styles of hand poses.
- Natural hand poses: These are designed to look as if the user is not using a controller and is interacting naturally with their hand.
- Controller hand poses: These are designed to be rendered while also rendering the controller. We provide different shapes depending on the controller type. Currently, Capsense supports the Quest 2, Quest 3, and Quest Pro controllers.
- Benefit from a best-in-class logical hand implementation and future improvements instead of investing in a custom implementation.
- In Unity, due to limitations in the current plugin implementation, if hand tracking is enabled and the hand is not on the controller, hand poses fall back to hand tracking data.
- When using Link on PC, pose data for controllers is unavailable when you’re not actively using them (such as when they’re lying on a table).
- In Unity, due to the object hierarchy on the OVRCameraRig Tracking Space, it is non-trivial to provide the hand data and the controller data simultaneously with the legacy anchors. This has required us to create multiple new anchors on the Tracking Space and to add gating logic on the controller and hands prefabs. The gating logic determines if the prefabs should render.
Prior to v65, hand scale was ignored for hand tracking whenever Capsense was enabled. To fix this, rebuild your project with Core SDK v65 or higher.
- Supported devices: Quest 2, Quest Pro, Quest 3 and all future devices.
- Unity 2022.3.15f1+ (Unity 6+ is recommended)
- Meta XR Core SDK v62+
- Fully compatible with Wide Motion Mode (WMM).
- Capsense hand poses and body tracking through MSDK can run simultaneously, but they use different implementations for converting controller data to hands, so the positions and orientations of joints will differ slightly.
To leverage the TrackingSpace on the OVRCameraRig prefab, you need to add some prefabs.
1. Open your Unity scene.
2. Under Hierarchy, add an OVRControllerPrefab as a child of RightControllerInHandAnchor.
3. Add another OVRControllerPrefab as a child of LeftControllerInHandAnchor.
4. Add an OVRHandPrefab as a child of RightHandOnControllerAnchor.
5. Add another OVRHandPrefab as a child of LeftHandOnControllerAnchor.
6. For each of the four prefabs you just added, under Inspector, set Show State to Controller In Hand.
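After completing the steps above, you can sanity-check the hierarchy at runtime. The sketch below is a hypothetical helper (not part of the SDK); the anchor paths are derived from the anchor descriptions later in this page, so verify them against your own scene:

```csharp
using UnityEngine;

// Hypothetical sanity check: warns if any of the Capsense anchors
// referenced in the steps above is missing or has no child prefab.
public class CapsenseHierarchyCheck : MonoBehaviour
{
    // Assign the OVRCameraRig's TrackingSpace transform in the Inspector.
    public Transform trackingSpace;

    void Start()
    {
        string[] anchorPaths =
        {
            "LeftHandAnchor/LeftControllerInHandAnchor",
            "RightHandAnchor/RightControllerInHandAnchor",
            "LeftHandAnchor/LeftControllerInHandAnchor/LeftHandOnControllerAnchor",
            "RightHandAnchor/RightControllerInHandAnchor/RightHandOnControllerAnchor",
        };

        foreach (var path in anchorPaths)
        {
            var anchor = trackingSpace != null ? trackingSpace.Find(path) : null;
            if (anchor == null)
                Debug.LogWarning($"Capsense anchor not found: {path}");
            else if (anchor.childCount == 0)
                Debug.LogWarning($"No prefab added under anchor: {path}");
        }
    }
}
```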
The Show State property set to Controller In Hand.
The SDK package includes a Unity sample for using this feature. It is titled ControllerDrivenHandPoses.
The OVRManager script on the OVRCameraRig prefab has a new enum selector: Controller Driven Hand Poses Type. This enum has three options:
- None: Hand poses will only ever be populated with data from the tracking cameras, and only if hand tracking is active.
- Conforming To Controller: Hand poses generated from controller data will be located around the controller model.
- Natural: Hand poses generated with this option will be positioned in a more natural state, as if the user were not holding a controller.
The Controller Driven Hand Poses Type property.
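If you prefer to set this selector from script rather than the Inspector, a minimal sketch follows. It assumes the selector is exposed on the OVRManager instance as a `controllerDrivenHandPosesType` field with a nested enum matching the Inspector labels; check the OVRManager source in your SDK version:

```csharp
using UnityEngine;

// Hypothetical sketch: switches the Capsense pose style at runtime.
// Field and enum names are assumed from the Inspector labels above.
public class CapsensePoseStyle : MonoBehaviour
{
    void Start()
    {
        // Render hands as if the user were not holding a controller.
        OVRManager.instance.controllerDrivenHandPosesType =
            OVRManager.ControllerDrivenHandPosesType.Natural;
    }
}
```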
The integration provides four new C# script functions to control Capsense:

- `void SetControllerDrivenHandPoses(bool)`: Sets whether the system can provide hand poses using controller data.
- `void SetControllerDrivenHandPosesAreNatural(bool)`: Sets the application's request to provide the controller-driven hand poses as natural instead of wrapped around the controller.
- `bool IsControllerDrivenHandPosesEnabled()`: Queries whether the system can use controller data for hand poses.
- `bool AreControllerDrivenHandPosesNatural()`: Queries whether the poses supplied from controller data are in a natural form instead of wrapped around a controller.
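The functions above can be exercised from a script. The sketch below assumes they are instance methods on OVRManager, which this page does not state explicitly; confirm the owning class in your SDK version:

```csharp
using UnityEngine;

// Hypothetical sketch: requests natural controller-driven hand poses,
// then reads the state back. Assumes the four functions above live
// on the OVRManager instance.
public class CapsenseToggle : MonoBehaviour
{
    void Start()
    {
        OVRManager.instance.SetControllerDrivenHandPoses(true);
        OVRManager.instance.SetControllerDrivenHandPosesAreNatural(true);

        if (OVRManager.instance.IsControllerDrivenHandPosesEnabled() &&
            OVRManager.instance.AreControllerDrivenHandPosesNatural())
        {
            Debug.Log("Capsense natural hand poses active.");
        }
    }
}
```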
OVRControllerHelper and OVRHand now have a ShowState enum. The available options are:
- Always: The object is never automatically disabled based on controller and hand state.
- Controller in Hand or no Hand: The object is disabled if the controller is not in the user’s hand, or if hand tracking is disabled entirely.
- Controller in Hand: The object is disabled if the controller is not currently in the user’s hand.
- Controller Not in Hand: The object is disabled if the controller is in the user’s hand. This is used for the detached controller situation, for example, a controller sitting on a desk.
- No Hand: The object is disabled if hand tracking is enabled and a tracked hand is present.
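The ShowState options above can also be set from script. The sketch below is an assumption-heavy illustration: this page only names the enum, so the `ShowState` type name and a public `showState` field on the components are hypothetical; check OVRControllerHelper and OVRHand in your SDK version for the actual member names:

```csharp
using UnityEngine;

// Hypothetical sketch: applies the same ShowState to a controller helper
// and a hand. The field name `showState` is an assumption, not confirmed
// by this page.
public class CapsenseShowStateSetup : MonoBehaviour
{
    public OVRControllerHelper controllerHelper;
    public OVRHand hand;

    void Start()
    {
        // Render both objects only while the controller is in the hand.
        controllerHelper.showState = OVRInput.InputDeviceShowState.ControllerInHand;
        hand.showState = OVRInput.InputDeviceShowState.ControllerInHand;
    }
}
```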
OVRControllerPrefab: OVRControllerHelper
- Show State
- Show When Hands Are Powered By Natural Controller Poses: This checkbox controls whether the controllers can be rendered even when the hand poses are in the natural state. This is used for the detached controller state.
OVRHandPrefab: OVRHand
New anchors for the Capsense feature
- LeftControllerInHandAnchor and RightControllerInHandAnchor:
These anchors are under their respective LeftHandAnchor and RightHandAnchor parents. They let you render the hand and controller at the same time while the controller is in the user’s hand. To do that, add an OVRControllerPrefab to each anchor, matching the hand, and set Show State to Controller In Hand.
- LeftHandOnControllerAnchor and RightHandOnControllerAnchor:
These anchors are under their respective LeftControllerInHandAnchor and RightControllerInHandAnchor parents. They let you render the hand and controller at the same time while the controller is in the user’s hand. To do that, add an OVRHandPrefab to each anchor, make sure Hand Type matches the hand, and set Show State to Controller In Hand.
How can I confirm Capsense is running on my headset?
In your headset, you should see either hands instead of controllers or hands holding controllers. Also, hand pose data should be provided while the hands are active with the controllers.
Can I evaluate the feature on my headset without changing my code?
No, using Capsense requires some code changes.
Troubleshooting Capsense over Link
The following are some common pitfalls for using Capsense over Link specifically.
Is my OVRManager configured correctly?
In your scene, check the OVRManager script on your camera rig and confirm that the “Controller Driven Hands” property is set to “Conforming To Controller.”
Have I selected the right XR Plug-in Provider?
In your Unity Player settings, navigate to XR Plugin Management and confirm that the Oculus plug-in provider is enabled for all relevant platforms (Android to run on Quest devices, your computer’s operating system to run over Link).
Have I enabled all the necessary Link features?
In your Meta Quest Link app on your development computer, navigate to Settings > Beta and ensure the setting for Developer Runtime Features is enabled.
Is my OVRHand Show State set correctly?
On your camera rig’s OVRHandPrefabs (OVRCameraRig > TrackingSpace > Left/RightHandAnchor > OVRHandPrefab), ensure that the “Show State” on the OVRHand script is set to “Controller in Hand.”