Input Data Overview
Updated: Aug 7, 2024
This topic explains how Interaction SDK gets, structures, and modifies tracking data to create interactions. In Interaction SDK, all interactions rely on accurate data about the position and rotation of your hands and controllers, which the SDK gets directly from your headset’s cameras and controllers.
Interaction SDK formats the input data using a set of interfaces (IController, IHand, IHmd, and IBody). Each interface handles a certain type of data. For example, IHand interprets raw hand data, and IController interprets raw controller data. The interfaces define how the raw input data is organized so that it’s understandable by Interaction SDK components and prefabs.
To actually get the raw input data from the OVR (Oculus Virtual Reality) Plugin (your headset), Interaction SDK uses the From OVR ...Source or From Unity XR ...Source components, where “...” is the type of data (for example, “From OVR Body Data Source” or “From OVR Hand Data Source”). Each From ...Source component gets a specific type of data and uses one of these interfaces to organize it. The data from these components is what your interactions use to determine the location of your hands, controllers, or body. You don’t need to manually add these components since they’re already included in their relevant Interaction SDK prefab (for example, the OVRHands prefab contains the From OVR Hand Data Source component, and the UnityXRHands prefab contains the From Unity XR Hand Data Source component).
The From ...Source components are:
- In the Meta Interaction SDK package:
  - From OVR Body Data Source
  - From OVR Controller Data Source
  - From OVR Controller Hand Data Source
  - From OVR Hand Data Source
  - From OVR Hmd Data Source
- In the Meta Interaction SDK Essentials package (if Unity XR Hands is installed):
  - From Unity XR Hand Data Source
  - From Unity XR Controller Data Source
  - From Unity XR Hmd Data Source
Raw input data from the headset flows to each of the From ...Source components, each of which uses one of the Interaction SDK interfaces.
Once you have input data from a From ...Source component, it’s usable by an Interactor. However, hand data can be processed before it’s routed (sent) to an Interactor; other types of input data don’t need to be processed. Processing hand data lets you:
- Minimize or remove jitter (shaking).
- Pose the virtual hand’s fingers differently than your physical fingers, such as when poking a button or grabbing a virtual object.
- Offset the origin of a ray interactor based on the position of your virtual hand or controller instead of your physical hand or controller.
To process hand data, you pass it to certain components that use the IHand interface, like HandFilter, and then you use that component as the data source for your Interactor instead of the From ...Source component. You do this regardless of whether you’re using hands or controller-driven hands.
Here’s a recommended way to process hand data, taken from the HandGrabExamples scene. To use this process for interactions other than poke or grab, omit the SyntheticHand, since only poke and grab need to modify the hand joint data for the hand visual to look right.
Hand data being processed by multiple components before being rendered.
Here’s how those components process the hand data.
- FromOVRHandDataSource converts data from OVRHand using IHand so Interaction SDK components can understand it.
  - If FromUnityXRHandDataSource is used instead, it converts OpenXR data received from Unity XR Hands into Core SDK hand data.
- HandFilter smooths the input hand position data using the provided filter, reducing jitter.
- SyntheticHand overrides hand joint data to affect the hand’s pose. It does that to prevent fingers from going through buttons during a poke and to make fingers conform to a pose when grabbing an object.
- HandVisual renders the hand using the processed data.
The difference between hand data not processed by SyntheticHand and data processed by SyntheticHand. During the poke, the hand using unprocessed data passes through the button, while the left hand, which uses processed data, visually limits the poke.
All hand and controller input data eventually goes to an Interactor. Because all hand data components, like those in the section above, implement IHand, you can use any of them to drive various aspects of your application. For example, in a ray interaction, you could use hand data from HandFilter to set the origin of the ray, and you could use SyntheticHand to set the hand visuals.
Routing data to different parts of an application. In the top diagram, the RayInteractor receives hand data to decide where your physical hand is, but the HandVisual receives a filtered version of that data to smooth the movement of the virtual hand. In the bottom diagram, SyntheticHand receives two sets of data. The data directly from Hand tells it where your physical hand is, and the data from HandGrabVisual determines how the virtual hand should look when you grab something.
The OVRInteraction prefab is the base for attaching hands, controllers, and controller-driven hands.
By default, the OVRInteraction prefab contains just an OVRHmd child object.
The primary data component for hands in the Interaction SDK is called Hand.
Hand provides pose data, pinch states, pointer pose, and input availability as related to hands.
The FromOVRHandDataSource component can source this data from OVRHand. The FromUnityXRHandDataSource component converts OpenXR data received from Unity XR Hands into Core SDK hand data.
IHand is the primary interface through which Hand data is accessed. Components consuming hand data should prefer to do so through the IHand interface rather than the concrete Hand.
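For instance, a component can consume hand data through IHand using the SDK’s serialized-interface pattern. Here’s a minimal sketch; the WristLogger class is illustrative, not part of the SDK:

```csharp
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;

// A minimal sketch of consuming hand data through IHand rather than the
// concrete Hand. The [Interface] attribute restricts the inspector field
// to objects that implement IHand.
public class WristLogger : MonoBehaviour
{
    [SerializeField, Interface(typeof(IHand))]
    private UnityEngine.Object _hand;
    private IHand Hand;

    protected virtual void Awake()
    {
        Hand = _hand as IHand;
    }

    protected virtual void Update()
    {
        // GetRootPose returns true and fills in the wrist/root pose
        // when valid tracking data is available.
        if (Hand.GetRootPose(out Pose rootPose))
        {
            Debug.Log($"Wrist position: {rootPose.position}");
        }
    }
}
```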
HandRef is a passthrough component for IHand components. All interactor prefabs have HandRef components on their root GameObject that their child components can be wired to.
The primary advantage of doing this is to minimize the amount of scene wiring necessary to connect an interaction prefab. Instead of having to wire the Hand from the OVRInteraction prefab to each component that needs it in the interaction prefab, only one connection needs to be made to the top-level HandRef of the prefab, and all child objects in the prefab will reference that HandRef.
The primary data component for controllers in the Interaction SDK is called Controller.
Controller provides controller pose data, button states, and input availability as related to controllers.
The FromOVRControllerDataSource component can source this data from OVRInput. The FromUnityXRControllerDataSource component sources this data from the Unity Input System’s OpenXR support.
IController is the primary interface through which Controller data is accessed. Components consuming controller data should prefer to do so through the IController interface rather than the concrete Controller.
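The same serialized-interface pattern applies to controllers. In this sketch, TryGetPose is an assumption about the IController surface, and the class name is illustrative:

```csharp
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;

// A minimal sketch of consuming controller data through IController.
// TryGetPose is assumed; verify it against your SDK version.
public class ControllerPoseLogger : MonoBehaviour
{
    [SerializeField, Interface(typeof(IController))]
    private UnityEngine.Object _controller;
    private IController Controller;

    protected virtual void Awake()
    {
        Controller = _controller as IController;
    }

    protected virtual void Update()
    {
        // Log the controller's grip pose whenever it is valid.
        if (Controller.TryGetPose(out Pose pose))
        {
            Debug.Log($"Controller position: {pose.position}");
        }
    }
}
```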
Similar to HandRef, ControllerRef is a passthrough component for IController components. All interactor prefabs that wish to reference controllers should have ControllerRef components on their root GameObject, which their child components can be wired to.
Data travels from OVRPlugin up to input data types like Controller and Hand through a number of classes, all of which implement the IDataSource interface through a generic base class: DataSource<TDataType>.
The most important member of the interface is the IDataSource.InputDataAvailable event, which is how components pass updated tracking and pose data to their dependents.
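As a sketch of that flow, a dependent component can subscribe to the event and react when new data arrives (this assumes the event is a plain System.Action and uses the same serialized-interface wiring shown earlier):

```csharp
using Oculus.Interaction;
using Oculus.Interaction.Input;
using UnityEngine;

// A sketch of reacting to updated tracking data via
// IDataSource.InputDataAvailable.
public class DataUpdateListener : MonoBehaviour
{
    [SerializeField, Interface(typeof(IDataSource))]
    private UnityEngine.Object _dataSource;
    private IDataSource DataSource;

    protected virtual void Awake()
    {
        DataSource = _dataSource as IDataSource;
    }

    protected virtual void OnEnable()
    {
        DataSource.InputDataAvailable += HandleInputDataAvailable;
    }

    protected virtual void OnDisable()
    {
        DataSource.InputDataAvailable -= HandleInputDataAvailable;
    }

    private void HandleInputDataAvailable()
    {
        // Updated tracking and pose data is now ready to consume.
    }
}
```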
DataSource derived classes are able to provide data of a given type (e.g. HandDataAsset).
DataModifier (which itself derives from DataSource) adds further functionality: it acts as a post-processor on a HandDataAsset. DataModifiers read data from a DataSource, apply changes, cache the results, then offer those results through the IDataSource interface.
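For illustration, here’s a sketch of a custom modifier in that style. It assumes DataModifier<HandDataAsset> exposes a protected Apply(HandDataAsset) override point, which is the pattern modifiers like HandFilter follow; the class name and wrist offset are hypothetical:

```csharp
using Oculus.Interaction.Input;
using UnityEngine;

// Hypothetical post-processor that nudges the wrist pose upward.
// Assumes DataModifier<TData> provides a protected Apply(TData) hook.
public class WristOffsetModifier : DataModifier<HandDataAsset>
{
    [SerializeField]
    private Vector3 _offset = new Vector3(0f, 0.01f, 0f);

    protected override void Apply(HandDataAsset data)
    {
        if (!data.IsDataValid)
        {
            return;
        }

        // Downstream consumers (HandVisual, interactors) receive the
        // modified pose through the IDataSource chain.
        data.Root = new Pose(data.Root.position + _offset, data.Root.rotation);
    }
}
```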
The LastKnownGoodHand only passes the last known good hand data down the modifier chain. If the data it’s fed becomes invalid for any reason (tracking lost, low tracking quality), it maintains the last valid hand data.
SyntheticHand can be used to manually modify the joint data of a hand, including the position and rotation of the fingers and wrist. It’s used in the HandGrab scene to adjust how your fingers grab the mug. It’s also used when pushing buttons to visually limit how far the finger can move; this is called poke limiting.
Note
SyntheticHand should be updated after all interaction logic for the frame has completed, typically at the end of the frame. SyntheticHand gracefully locks and unlocks joints by either tweening them into the desired pose or constraining the maximum rotation and spread allowed. Tweaking the provided curves and speeds allows more or less snappiness.
You can override joints by calling directly into a SyntheticHand. In other contexts, there are components that drive joint locking and unlocking:
Because multiple InteractorVisuals may want to lock or unlock joints at the same time, an InteractorGroup can ensure that only one InteractorVisual affects a SyntheticHand at a time.
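Here’s a rough sketch of calling into a SyntheticHand directly to lock and release the wrist. The LockWristPose and FreeWrist method names are assumptions about the SyntheticHand surface, so verify them against the component in your SDK version:

```csharp
using Oculus.Interaction.Input;
using UnityEngine;

// Sketch only: LockWristPose/FreeWrist are assumed method names.
public class WristLockExample : MonoBehaviour
{
    [SerializeField]
    private SyntheticHand _syntheticHand;

    public void SnapWristTo(Pose targetPose)
    {
        // Fully override the wrist pose (override factor of 1).
        _syntheticHand.LockWristPose(targetPose, 1f);
    }

    public void Release()
    {
        // Return wrist control to the tracked hand data.
        _syntheticHand.FreeWrist();
    }
}
```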
HandFilter employs a One Euro Filter to smooth both the positional and rotational data of an input stream. The One Euro Filter is a speed-based filter that helps eliminate jitter without increasing lag. The field Filter Parameters is optionally set using a HandFilterParameterBlock asset containing the following values for wrist position, wrist rotation, and finger rotation:
- Beta (0 indicates maximum lag, while 10 is minimal lag).
- Min cutoff (0 filters out the most jitter at the expense of more lag, while 10 filters out minimal jitter).
- D cutoff represents the derivative cutoff and can be used to further tune the effect.
Additionally, HandFilterParameterBlock contains a frequency value, measured in frames per second, which is passed to the One Euro Filter. If no value is set for Filter Parameters, the filter will not be employed.
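To make those parameters concrete, here’s a minimal one-dimensional One Euro Filter sketch (not the SDK’s implementation, which filters full wrist and finger poses). Fast motion raises the cutoff to cut lag; slow motion lowers it to suppress jitter:

```csharp
using System;

// A minimal one-dimensional One Euro Filter illustrating how Beta,
// Min cutoff, and D cutoff interact.
public class OneEuroFilter
{
    private readonly float _minCutoff; // lower = more smoothing (more lag)
    private readonly float _beta;      // higher = less lag during fast motion
    private readonly float _dCutoff;   // cutoff for the derivative estimate
    private readonly float _rate;      // sample frequency in frames per second

    private bool _initialized;
    private float _prevX, _prevDx;

    public OneEuroFilter(float rate, float minCutoff = 1f, float beta = 0f, float dCutoff = 1f)
    {
        _rate = rate;
        _minCutoff = minCutoff;
        _beta = beta;
        _dCutoff = dCutoff;
    }

    private float Alpha(float cutoff)
    {
        // Smoothing factor of an exponential low-pass at the given cutoff.
        float tau = 1f / (2f * MathF.PI * cutoff);
        return 1f / (1f + tau * _rate);
    }

    public float Filter(float x)
    {
        if (!_initialized)
        {
            _initialized = true;
            _prevX = x;
            return x;
        }

        // Estimate and smooth the signal's speed.
        float dx = (x - _prevX) * _rate;
        float edx = _prevDx + Alpha(_dCutoff) * (dx - _prevDx);
        _prevDx = edx;

        // Fast movement raises the cutoff (less lag); slow movement
        // keeps it near Min cutoff (less jitter).
        float cutoff = _minCutoff + _beta * MathF.Abs(edx);
        float result = _prevX + Alpha(cutoff) * (x - _prevX);
        _prevX = result;
        return result;
    }
}
```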
The primary data component for body in the Interaction SDK is called Body.
Body provides pose data, input availability, and the joint mapping as related to the body skeleton. The FromOVRBodyDataSource component can source this data from OVRBody in the Meta XR Core SDK, which is available individually or as part of the Meta XR All-in-One SDK.
If using only Unity XR data sources, body data is unavailable.
IBody is the primary interface through which Body data is accessed. Components consuming body data should prefer to do so through the IBody interface rather than the concrete Body.
Different body skeletons can contain different joint sets and parent/child relationships; the ISkeletonMapping interface represents these parameters.
When using local pose data for joints, the parent of the joint must be known in order for the data to be useful. The ISkeletonMapping.TryGetParentJointId() method should be used to find the parent of a provided joint.
Because BodyJointId contains a large set of joints that may not be available in the current skeleton data, the ISkeletonMapping.Joints HashSet should be checked for the presence of a given joint before use.
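As a sketch of both checks together (assuming IBody exposes the mapping through a SkeletonMapping property and that the body types live in Oculus.Interaction.Body.Input):

```csharp
using Oculus.Interaction;
using Oculus.Interaction.Body.Input;
using UnityEngine;

// A sketch of checking joint availability before looking up a parent
// joint via ISkeletonMapping, following the guidance above.
public class JointParentLogger : MonoBehaviour
{
    [SerializeField, Interface(typeof(IBody))]
    private UnityEngine.Object _body;
    private IBody Body;

    protected virtual void Awake()
    {
        Body = _body as IBody;
    }

    public void LogParent(BodyJointId joint)
    {
        ISkeletonMapping mapping = Body.SkeletonMapping;

        // Not every skeleton provides every BodyJointId.
        if (!mapping.Joints.Contains(joint))
        {
            Debug.Log($"{joint} is not present in this skeleton.");
            return;
        }

        if (mapping.TryGetParentJointId(joint, out BodyJointId parent))
        {
            Debug.Log($"Parent of {joint} is {parent}.");
        }
    }
}
```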
BodySkeletonMapping is a ScriptableObject that contains an ISkeletonMapping and itself exposes the ISkeletonMapping interface.
- For a complete overview of Interaction SDK architecture, see Architecture Overview.
- To learn about Interactors, which are attached to your hands or controllers to initiate interactions, see Interactors.
- To learn about Interactables, which are attached to objects that should respond to Interactors, see Interactables.
- To learn about how Interactors are prioritized when there’s more than one hovering at a time, see InteractorGroup.
- To learn about the components of body pose detection, see Body Pose Detection.