VrApi Input API

This document describes using the VrApi Input API.

The VrApi Input API allows applications linked to VrApi to enumerate and query the state of devices connected to a Mobile VR device. When a device is enumerated, its current state can be queried using the Input API.

The Input API is defined in VrApi/Src/VrApi_Input.h. For sample usage, see VrSamples/Native/VrController.

System Input and SIGILL

The mobile VR runtime reserves the Back, Home, Volume Up, and Volume Down buttons for system input. Applications never receive Home or Volume button events, and Back button presses are deferred and may be consumed by the system in the case of a long-press. To capture this input from all input devices, the VR runtime installs a SIGILL interrupt handler that hooks input in the VR application process. If your engine installs its own SIGILL handler, it may conflict with the mobile VR runtime’s handler and cause undefined behavior.

Input Devices Supported

Each device listed below has buttons and interactions that are unavailable to applications or reserved for system use. The Home button, Back button long-press, Volume Up, and Volume Down buttons on the controller and headset are reserved for the system and will not appear in the button state of either input device. Please see the Reserved User Interactions page for more information about reserved interactions.

Oculus Touch Controllers

Oculus Quest comes with two 6DoF Oculus Touch controllers. Each Oculus Touch controller has a grip trigger, an index trigger, two face buttons, and a clickable thumb stick. The face buttons, index trigger, and grip trigger are capacitive.

The left controller has a Menu button, and the right controller has a system-reserved Home button. You must insert batteries into both controllers before powering on the headset.

Oculus Go Controller

The Oculus Go controller is an orientation-tracked input device. The controller’s position relative to the user is estimated by a body model based on the orientation of the device.

Left- or right-handedness is specified by the user during controller pairing and determines on which side of the user’s body the controller is positioned.

The Oculus Go controller has a touchpad, a trigger, a Home button, and a Back button.

Bluetooth Gamepads

Bluetooth gamepads are also exposed through the VrApi Input API, which attempts to map each device to the classic gamepad model of X, Y, B, and A buttons, left and right triggers, left and right bumpers, a d-pad, and two joysticks.

Enumerating Devices

In order to find a device, an application should call vrapi_EnumerateInputDevices. This function takes a pointer to the ovrMobile context, an index, and a pointer to an ovrInputCapabilityHeader structure. If a device exists for the specified index, the ovrInputCapabilityHeader’s Type and DeviceID members are set upon return.

Once a device is enumerated, its full capabilities can be queried with vrapi_GetInputDeviceCapabilities. This function also takes a pointer to an ovrInputCapabilityHeader structure, but the caller must pass a structure that is appropriate for the ovrControllerType that was returned by vrapi_EnumerateInputDevices.

For instance, if vrapi_EnumerateInputDevices returns a Type of ovrControllerType_TrackedRemote when passed an index of 0, then the call to vrapi_GetInputDeviceCapabilities should pass a pointer to the Header field inside an ovrInputTrackedRemoteCapabilities structure. For example:

ovrInputCapabilityHeader capsHeader;
if ( vrapi_EnumerateInputDevices( ovr, 0, &capsHeader ) >= 0 )
{
   if ( capsHeader.Type == ovrControllerType_TrackedRemote )
   {
      ovrInputTrackedRemoteCapabilities remoteCaps;
      remoteCaps.Header = capsHeader;
      if ( vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header ) >= 0 )
      {
         // remote is connected
      }
   }
}

After successful enumeration, the ovrInputCapabilityHeader structure that was passed to vrapi_EnumerateInputDevices will have its DeviceID field set to the device ID of the enumerated controller.

The device state can then be queried by calling vrapi_GetInputTrackingState as described below.

Device Connection and Disconnection

Devices are considered connected once they are enumerated through vrapi_EnumerateInputDevices, and when vrapi_GetInputTrackingState and vrapi_GetCurrentInputState return valid results.

vrapi_EnumerateInputDevices does not do any significant work and may be called each frame to check if a device is present or not.
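
For example, an application might enumerate all connected devices each frame with a loop like the following sketch, which assumes that a negative result indicates no device exists at the given index, matching the success check used in the example above:

for ( uint32_t deviceIndex = 0; ; deviceIndex++ )
{
   ovrInputCapabilityHeader capsHeader;
   if ( vrapi_EnumerateInputDevices( ovr, deviceIndex, &capsHeader ) < 0 )
   {
      break;   // no device at this index; enumeration is complete
   }
   // capsHeader.Type and capsHeader.DeviceID identify the connected device.
}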

Querying Device Input State

The state of the input device can be queried via the vrapi_GetCurrentInputState function.

This function takes a device ID and a pointer to an ovrInputStateHeader structure. Before calling it, fill in the header’s Type field with the type of device that is associated with the passed device ID. Make sure the structure passed to this function is not just a header, but the appropriate input state structure for the device type. For instance, when querying a controller, pass an ovrInputStateTrackedRemote structure with the Header.Type field set to ovrControllerType_TrackedRemote.

ovrInputStateTrackedRemote remoteState;
remoteState.Header.Type = ovrControllerType_TrackedRemote;
if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
{
   // act on device state returned in remoteState
}

vrapi_GetCurrentInputState returns the controller’s current button and trackpad state.

Querying Device Tracking State

To query the orientation tracking state of a device, call vrapi_GetInputTrackingState and pass it a predicted pose time. Passing a predicted pose time of 0 will return the most recently sampled pose.

ovrTracking trackingState;
// Pass 0.0 as the predicted pose time to get the most recently sampled pose.
if ( vrapi_GetInputTrackingState( ovr, controllerDeviceID, 0.0, &trackingState ) >= 0 )

VrApi implements an arm model that uses the controller’s orientation to synthesize a plausible hand position each frame. The tracking state will return this position in the Position field of the predicted tracking state’s HeadPose.Pose member.

Controller handedness may be queried using vrapi_GetInputDeviceCapabilities as described in Enumerating Devices above.

Applications that implement their own arm models are free to ignore this position and calculate a position based on the Orientation field that is returned in the predicted tracking state’s pose.
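
As a minimal sketch, both the synthesized position and the raw orientation can be read from the tracking state returned by the call above, using the ovrVector3f and ovrQuatf types from the VrApi headers:

// Position synthesized by the VrApi arm model from the controller orientation.
const ovrVector3f controllerPosition = trackingState.HeadPose.Pose.Position;

// Raw controller orientation, for applications that implement their own arm model.
const ovrQuatf controllerOrientation = trackingState.HeadPose.Pose.Orientation;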

Re-Centering the Controller

Users may experience some orientation drift in the yaw axis, causing the physical controller’s orientation to go out of alignment with its VR representation.

To synchronize the physical controller’s orientation with the VR representation, users should:

  1. Point the controller in the direction of the forward axis of their headset, and
  2. Press and hold the Home button for one second.

When a re-center occurs, the VrApi arm model is notified and the arm model’s shoulders are repositioned to align to the headset’s forward vector. This is necessary because the shoulders do not automatically rotate with the head.

Applications that implement their own arm models can poll the device input state’s RecenterCount field to determine when the controller is re-centered. RecenterCount increments only when a re-center is performed. We recommend re-centering arm models based on the head pose when this field changes.
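
A minimal sketch of this polling, assuming RecenterCount is read from the ovrInputStateTrackedRemote state described earlier and that the application keeps the previous count itself:

// prevRecenterCount is state kept by the application across frames.
static uint32_t prevRecenterCount = 0;

ovrInputStateTrackedRemote remoteState;
remoteState.Header.Type = ovrControllerType_TrackedRemote;
if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
{
   if ( remoteState.RecenterCount != prevRecenterCount )
   {
      prevRecenterCount = remoteState.RecenterCount;
      // Re-center the application's own arm model based on the current head pose.
   }
}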

Touchpad Swiping Gestures

For touchpads, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

  • Swipe up: Pull content upward. Equivalent to scrolling down.
  • Swipe down: Pull content downward. Equivalent to scrolling up.
  • Swipe left: Pull content left or go to the next item or page.
  • Swipe right: Pull content right or go to the previous item or page.

Hand Tracking

Developer Preview Disclaimer: Hand tracking is presented as a developer preview feature in this release. Apps using this feature will not currently be accepted for submission into any Oculus release channel until the feature exits the developer preview phase in a future release.

Data Usage Disclaimer: Enabling support for hand tracking grants your app access to certain user data, such as the user’s estimated hand size and hand pose data. This data is only permitted to be used for enabling hand tracking within your app and is expressly forbidden for any other purpose.

Applications can enumerate hands as devices if the user has opted into the hand tracking feature and the application adds the following feature flag and permission to its manifest:

<uses-permission android:name="oculus.permission.handtracking" />
<uses-feature android:name="oculus.software.handtracking" android:required="false" />

Applications that do not specify these flags will not see hand devices enumerated.

If an application requires hands and does not work with other input devices, the application must add android:required="true" in the uses-feature declaration in its manifest. If an application is capable of working with hands or controllers, the application’s manifest should set android:required="false" in the uses-feature declaration as shown in the example above.

The native Mobile SDK includes a sample application demonstrating the use of the hand tracking API at VrSamples/VrHands.

Hand Tracking Functions

The structure, pose, and physical appearance of the hand are queried using three new VrApi functions.

vrapi_GetHandSkeleton

vrapi_GetHandSkeleton returns the rest pose, bone hierarchy, and capsule collision data for the specified hand. Each bone in the hand may have 0 or more collision capsules attached to it. Applications can use these capsules to perform collision tests using their own collision system.
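
A minimal sketch of querying the left hand skeleton once at startup; the ovrHandSkeleton structure, VRAPI_HAND_LEFT handedness value, and ovrHandVersion_1 version value are assumed to be the names used in VrApi_Input.h for this SDK:

ovrHandSkeleton skeleton;
skeleton.Header.Version = ovrHandVersion_1;
if ( vrapi_GetHandSkeleton( ovr, VRAPI_HAND_LEFT, &skeleton.Header ) >= 0 )
{
   // skeleton now contains the rest pose, bone hierarchy, and collision capsules.
}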

vrapi_GetHandPose

vrapi_GetHandPose returns the current pose state of the hand, along with some additional state such as confidence and hand scale. Upon return of vrapi_GetHandPose, the ovrHandPose structure passed to this function contains the current rotation of each of the hand bones. Apply these rotations to the corresponding bones in the hand skeleton returned by vrapi_GetHandSkeleton to calculate the pose of the user’s hand at the specified time.

The TrackingStatus field indicates the tracking status of the hand, either ovrHandTrackingStatus_Tracked or ovrHandTrackingStatus_Untracked.

The HandConfidence field is an enumeration that indicates the amount of confidence the tracking system has that the entire hand pose is correct. The confidence value can be either ovrConfidence_LOW or ovrConfidence_HIGH. Applications can use this value to forego rendering hands if the confidence becomes low.

The HandScale field is the scale of the user’s hand, relative to the default hand model scale of 1.0. The HandScale may change at any time and applications should use the value to scale the hand for rendering and interaction simulation at runtime.

The FingerConfidences field is an array of ovrConfidence values, one per finger, indicating the tracking system confidence that the pose for that finger is correct.
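
A minimal sketch of querying the current pose for an enumerated hand device; it assumes handDeviceID was returned by device enumeration, that vrapi_GetHandPose takes a prediction time like vrapi_GetInputTrackingState (0 returns the most recent sample), and that ovrHandPose and ovrHandVersion_1 are the names used in VrApi_Input.h:

ovrHandPose handPose;
handPose.Header.Version = ovrHandVersion_1;
if ( vrapi_GetHandPose( ovr, handDeviceID, 0.0, &handPose.Header ) >= 0 )
{
   if ( handPose.HandConfidence == ovrConfidence_HIGH )
   {
      // Apply the bone rotations in handPose to the skeleton returned by
      // vrapi_GetHandSkeleton, scaled by handPose.HandScale.
   }
}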

vrapi_GetHandMesh

vrapi_GetHandMesh returns the mesh for the specified hand. The mesh is returned as arrays of vertex attributes and vertex indices. These attributes include vertex blend weights.

Developers will normally want to read this data and convert it to a vertex buffer format of their own choosing. The mesh UVs are compatible with the Oculus Avatars SDK hand mesh.

A PNG image showing the UV mapping is included in the VrHands sample as VrHands/assets/HandsTextureMapping.png. A full FBX for each hand (left and right) is included in the VrHands/fbx folder. These FBX files are intended as a starting point for making custom hand meshes or skeletons.

Hand Tracking Structures

All of the hands functions take pointers to structures that require a version to be set in the structure’s Header.Version field before the structure is passed to the API functions. The Version field must be set to one of the enum values of type ovrHandVersion. If the Version field does not match one of the values, the API functions will return ovrError_InvalidParameter and the structure will be unmodified.

It is recommended that developers update to the latest version of the hand assets when updating to new versions of the native SDK. However, the structure versioning allows updating to newer versions of the SDK without necessarily changing an application’s hand model assets, or dealing with bone retargeting, which can be an involved process.

To use the API’s latest model assets, set the Version field of the API structures to the highest version in the ovrHandVersion enum. To continue using a previous version, simply do not update the Version field.

Note that older versions of meshes may be deprecated in the future, in which case an application rebuilt with a newer SDK and an older ovrHandVersion version may still receive ovrError_InvalidParameter when calling the hand API functions.

Also note that the mesh structure can be large (on the order of 256KB) and allocating multiple structures (for instance, one for the left hand and one for the right hand) simultaneously on the stack may risk overflowing the application stack. Applications can address this by querying and processing only one hand at a time using the same structure, increasing the application stack size, or dynamically allocating the structure passed to vrapi_GetHandMesh.
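
For example, the mesh structure might be heap-allocated rather than placed on the stack. This sketch assumes the ovrHandMesh structure and VRAPI_HAND_LEFT handedness value from VrApi_Input.h and uses standard C heap allocation (stdlib.h):

// Requires <stdlib.h> for malloc/free.
ovrHandMesh * mesh = (ovrHandMesh *)malloc( sizeof( ovrHandMesh ) );
if ( mesh != NULL )
{
   mesh->Header.Version = ovrHandVersion_1;
   if ( vrapi_GetHandMesh( ovr, VRAPI_HAND_LEFT, &mesh->Header ) >= 0 )
   {
      // Convert the vertex attributes and indices to the engine's own vertex buffer format.
   }
   free( mesh );
}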

Hand Joints

Hand joints are described by the ovrHandBone enumeration, which defines an invalid joint, all skinned joints, and a tip joint for each finger. While the tip joints are not skinned (that is, not bound to any mesh vertices) they provide useful location information for the tips of the fingers.

The following image shows joints in the hand skeleton grouped by prefix.

[Image: Hand Joints]

Hand State

Basic hand state, the state that is derived from the hand’s pose, is queried using the vrapi_GetCurrentInputState method, just like other input devices. If the queried device is a hand, this function must be passed a pointer to an ovrInputStateHand structure. This structure includes the current pointer pose for the hand, the relative strength of any supported gestures such as pinching, and flags indicating the status of the pointer and some gestures.

In order to normalize interactions across various applications, the hand tracking API exposes a filtered pointer pose and detection for pinch gestures. In future SDKs the set of supported gestures may be expanded.

Simple applications that only require point-and-click interactions can use the pointer pose to treat hands as a simple pointing device, with the pinch gesture acting as the click action. Using PointerPose and pinches from ovrInputStateHand is a good way to ensure your app conforms to the same interaction model as Oculus system applications.

Note that the hand pointer pose can also be queried by passing a device ID representing one of the tracked hands to vrapi_GetInputTrackingState.

Pinches

Pinches are the basic interaction primitive for UI interactions using hands. A successful pinch of the index finger can be treated the same as a normal click, select, or trigger action on a controller, that is, the action that activates a button or other control in a UI.

The ovrInputStateHand structure exposes both a strength value and a Boolean status for pinches. If a pinch gesture is detected, the corresponding ovrInputStateHandStatus_<finger name>Pinching bit for the pinching finger is set in the InputStateStatus field.

The progression of a pinch gesture is indicated by the PinchStrength field. For each finger, the corresponding array entry ranges from 0.0 (not pinching) to 1.0 (fully pinching, with the finger touching the thumb).

While PinchStrength is useful in some cases, such as sizing the pinch ray indicator as a pinch gesture is progressing, actual triggering of a pinch event should be based on the ovrInputStateHandStatus_<finger name>Pinching bit being set in the InputStateStatus field.

PointerPose

Deriving a stable pointing direction from a tracked hand is a non-trivial task involving filtering, gesture detection, and other factors. The hand tracking API provides the PointerPose field on the ovrInputStateHand structure so that pointing interactions can be consistent across applications.

The PointerPose field is an ovrPosef that indicates the starting point and orientation of the pointing ray in world space. It is recommended that developers use this field to determine the direction the user is pointing for UI interactions.

The pointer pose may or may not be valid at any time, depending on the hand position, tracking status, and other factors. When the pointer pose is valid, the ovrInputStateHandStatus_PointerValid flag will be set in the ovrInputStateHand::InputStateStatus field. If this flag is set, the pointer pose is valid and the ray it describes may be used for UI hit testing. If the flag is not set, then an application should avoid using or rendering the ray for UI hit testing.
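
A minimal point-and-click sketch combining the pinch status and pointer pose described above; it assumes handDeviceID was enumerated as a hand device and that ovrControllerType_Hand and ovrInputStateHandStatus_IndexPinching are the corresponding names in VrApi_Input.h:

ovrInputStateHand handState;
handState.Header.Type = ovrControllerType_Hand;
if ( vrapi_GetCurrentInputState( ovr, handDeviceID, &handState.Header ) >= 0 )
{
   if ( ( handState.InputStateStatus & ovrInputStateHandStatus_PointerValid ) != 0 )
   {
      // handState.PointerPose is the ray origin and orientation to use for UI hit testing.
      const bool select = ( handState.InputStateStatus & ovrInputStateHandStatus_IndexPinching ) != 0;
      // Treat 'select' as the click/trigger action on the hovered UI element.
   }
}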

System Gestures

If at any time the runtime detects that the user is performing a system gesture, the InputStateStatus field of ovrInputStateHand will have the ovrInputStateHandStatus_SystemGestureProcessing bit set. It is recommended that applications rendering their own hand models highlight the hands while this bit is set, to indicate to the user that a system gesture is in progress.

By checking this bit, an application can suspend its own gesture processing when the user is in the process of performing a system gesture. This allows applications to avoid triggering a gesture-based event when the user is intending a system gesture.
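
For instance, assuming an ovrInputStateHand structure named handState queried as in the earlier sketch:

if ( ( handState.InputStateStatus & ovrInputStateHandStatus_SystemGestureProcessing ) != 0 )
{
   // Highlight the rendered hands and skip application gesture detection this frame.
}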

Hand Tracking Limitations

Hand tracking for Oculus Quest is currently a preview feature with some limitations. While these limitations may be reduced or even eliminated over time, they are currently part of the expected behavior. Known limitations include the following:

Occlusion

Tracking may be lost or hand confidence may become low when one hand occludes another. In general, an application should respond to this by fading the hands away.

Noise

Hand tracking can exhibit some noise. This may be affected by lighting and environmental conditions. Developers should take this into consideration when developing algorithms for gesture detection.

Controllers + Hands

Controllers and hands are not currently tracked at the same time. Applications should expect either hands to be tracked or controllers to be tracked, but not both at the same time.

Lighting

Hand tracking has different lighting requirements than Oculus Touch tracking. In some situations this could result in head tracking working while hand tracking does not, or vice versa.