VrApi Input API (Deprecated)
Updated: Sep 29, 2022
Mobile SDK Deprecation
As of August 31, 2022, Mobile SDK and the VrApi library are no longer supported. Future updates will be delivered through OpenXR extensions and our OpenXR Mobile SDK, not through any new updates to Meta Mobile or PC APIs.
- New apps must use OpenXR unless a waiver is granted.
- New apps will not have access to Meta Native Mobile APIs, but existing apps can continue using them.
- No assistance will be provided for creating new apps with Meta Native APIs. You will find recommendations for migrating existing apps to OpenXR in the developer guides.
- Only critical security, privacy, or safety issues in Meta Native APIs will be addressed.
- Any testing of Meta Native Mobile will be restricted to automated QA tests, only to ensure core features remain functional.
This document describes using the VrApi Input API.
The VrApi Input API allows applications linked to VrApi to enumerate and query the state of devices connected to a Mobile VR device. When a device is enumerated, its current state can be queried using the Input API.
The Input API is defined in VrApi/Src/VrApi_Input.h. For sample usage, see VrSamples/Native/VrController.
The mobile VR runtime reserves the Home, Volume Up, and Volume Down buttons for system input; applications never see input from the Home or Volume buttons. To capture this input from all input devices, the VR runtime uses a SIGILL interrupt handler to hook input in the VR application process. If your engine installs its own SIGILL handler, it may conflict with the mobile VR runtime’s SIGILL handler and cause undefined behavior.
Controllers and Reserved Interactions
Some buttons on the controller are reserved for system use and will not appear in the button state on either input device. See the Reserved User Interactions page for more information about reserved interactions.
The supported controllers for the selected device are as follows:
Meta Quest Touch Controllers
Meta Quest comes with two 6DOF Meta Quest Touch controllers. Each Meta Quest Touch controller has a grip trigger, an index trigger, two face buttons, and a clickable thumb stick. The face buttons, index trigger, and grip trigger are capacitive.
The left controller has a Menu button, and the right controller has a system-reserved Home button. You must insert batteries into both controllers before powering on the headset.
To find a device, an application should call vrapi_EnumerateInputDevices. This function takes a pointer to the ovrMobile context, an index, and a pointer to an ovrInputCapabilityHeader structure. If a device exists for the specified index, the ovrInputCapabilityHeader’s Type and DeviceID members are set upon return.
Once a device is enumerated, its full capabilities can be queried with vrapi_GetInputDeviceCapabilities. This function also takes a pointer to an ovrInputCapabilityHeader structure, but the caller must pass a structure that is appropriate for the ovrControllerType that was returned by vrapi_EnumerateInputDevices.
For instance, if vrapi_EnumerateInputDevices returns a Type of ovrControllerType_TrackedRemote when passed an index of 0, then the call to vrapi_GetInputDeviceCapabilities should pass a pointer to the Header field inside of an ovrInputTrackedRemoteCapabilities structure. For example:
ovrInputCapabilityHeader capsHeader;
if ( vrapi_EnumerateInputDevices( ovr, 0, &capsHeader ) >= 0 )
{
    if ( capsHeader.Type == ovrControllerType_TrackedRemote )
    {
        ovrInputTrackedRemoteCapabilities remoteCaps;
        remoteCaps.Header = capsHeader;
        if ( vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header ) >= 0 )
        {
            // remote is connected
        }
    }
}
After successful enumeration, the ovrInputCapabilityHeader structure that was passed to vrapi_EnumerateInputDevices will have its DeviceID field set to the device ID of the enumerated controller. The device state can then be queried by calling vrapi_GetInputTrackingState as described below.
Device Connection and Disconnection
Devices are considered connected once they are enumerated through vrapi_EnumerateInputDevices, and when vrapi_GetInputTrackingState and vrapi_GetCurrentInputState return valid results. vrapi_EnumerateInputDevices does not do any significant work and may be called each frame to check whether a device is present.
The state of the input device can be queried via the vrapi_GetCurrentInputState function. It takes a device ID and a pointer to an ovrInputStateHeader structure. Before calling it, fill in the header’s ControllerType field with the type of device that is associated with the passed device ID. Make sure the structure passed to the function is not just a header, but the appropriate state structure for the device type. For instance, when querying a controller, pass an ovrInputStateTrackedRemote structure with the Header.ControllerType field set to ovrControllerType_TrackedRemote.
ovrInputStateTrackedRemote remoteState;
remoteState.Header.ControllerType = ovrControllerType_TrackedRemote;
if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
{
    // act on device state returned in remoteState
}
vrapi_GetCurrentInputState returns the controller’s current button and trackpad state.
Querying Device Tracking State
To query the orientation tracking state of a device, call vrapi_GetInputTrackingState and pass it a predicted pose time. Passing a predicted pose time of 0 returns the most recently sampled pose.
ovrTracking trackingState;
if ( vrapi_GetInputTrackingState( ovr, controllerDeviceID, 0.0, &trackingState ) >= 0 )
{
    // act on the tracking state returned in trackingState
}
VrApi implements an arm model that uses the controller’s orientation to synthesize a plausible hand position each frame. The tracking state returns this position in the Position field of the predicted tracking state’s HeadPose.Pose member.
Controller handedness may be queried using vrapi_GetInputDeviceCapabilities as described in Enumerating Devices above.
Applications that implement their own arm models are free to ignore this position and calculate a position based on the Orientation field that is returned in the predicted tracking state’s pose.