All Oculus Quest developers MUST PASS the concept review prior to gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.
We're no longer accepting submission of 32-bit Oculus Quest apps. Any new or updated Oculus Quest application needs to be 64-bit. Please contact Oculus if you are unable to comply with this policy. Oculus Go and Gear VR apps will not be affected by this change.
This document describes using the VrApi Input API.
The VrApi Input API allows applications linked to VrApi to enumerate and query the state of devices connected to an Oculus mobile VR device. When a device is enumerated, its current state can be queried using the Input API.
The Input API is defined in VrApi/Src/VrApi_Input.h. For sample usage, see VrSamples/Native/VrController.
The mobile VR runtime reserves the Menu, Home, Volume Up, and Volume Down buttons for system input. Applications will never see Home and Volume buttons. In order to capture this input from all input devices, the VR runtime uses a SIGILL interrupt handler to hook input in the VR application process. If your engine uses a SIGILL handler, it may conflict with the mobile VR runtime’s SIGILL handler and cause undefined behavior.
For all devices listed below, there are buttons and interactions that are unavailable or require specific interactions. Please see the Reserved User Interactions page for more information about reserved interactions.
Oculus Touch Controllers
Oculus Quest comes with two 6DoF Oculus Touch Controllers. Each Oculus Touch Controller has a grip trigger, an index trigger, two face buttons, and a clickable thumb stick. The face buttons, index trigger, and grip trigger are capacitive.
The left controller has a Menu button, and the right controller has a system reserved Home button. You must insert batteries into both controllers before powering on the headset.
Bluetooth Gamepads
Bluetooth gamepads are exposed through the VrApi Input API, which attempts to map each device to the classic gamepad model: X, Y, B, and A buttons, left and right triggers, left and right bumpers, a d-pad, and two joysticks.
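As a sketch of how that mapping surfaces in code (assuming the ovrControllerType_Gamepad type and ovrInputStateGamepad structure declared in VrApi_Input.h; verify the exact field names against your SDK version's header):

```c
// Sketch: polling a Bluetooth gamepad through the VrApi Input API.
#include "VrApi_Input.h"

void PollGamepad( ovrMobile * ovr, ovrDeviceID gamepadDeviceID )
{
    ovrInputStateGamepad gamepadState;
    gamepadState.Header.Type = ovrControllerType_Gamepad;
    if ( vrapi_GetCurrentInputState( ovr, gamepadDeviceID, &gamepadState.Header ) >= 0 )
    {
        // Buttons is a bit mask of ovrButton_* flags.
        if ( gamepadState.Buttons & ovrButton_A )
        {
            // A button is down
        }
        // Joystick axes (assumed field names), each component in [-1, 1]:
        float lx = gamepadState.LeftJoystick.x;
        float ly = gamepadState.LeftJoystick.y;
        (void)lx; (void)ly;
    }
}
```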
In order to find a device, an application should call vrapi_EnumerateInputDevices. This function takes a pointer to the ovrMobile context and an index and a pointer to an ovrInputCapabilityHeader structure. If a device exists for the specified index, the ovrInputCapabilityHeader’s Type and DeviceID members are set upon return.
Once a device is enumerated, its full capabilities can be queried with vrapi_GetInputDeviceCapabilities. This function also takes a pointer to an ovrInputCapabilityHeader structure, but the caller must pass a structure that is appropriate for the ovrControllerType that was returned by vrapi_EnumerateInputDevices.
For instance, if vrapi_EnumerateInputDevices returns a Type of ovrControllerType_TrackedRemote when passed an index of 0, then the call to vrapi_GetInputDeviceCapabilities should pass a pointer to the Header field inside of an ovrInputTrackedRemoteCapabilities structure. For example:
ovrInputCapabilityHeader capsHeader;
if ( vrapi_EnumerateInputDevices( ovr, 0, &capsHeader ) >= 0 )
{
    if ( capsHeader.Type == ovrControllerType_TrackedRemote )
    {
        ovrInputTrackedRemoteCapabilities remoteCaps;
        remoteCaps.Header = capsHeader;
        if ( vrapi_GetInputDeviceCapabilities( ovr, &remoteCaps.Header ) >= 0 )
        {
            // remote is connected
        }
    }
}
After successful enumeration, the ovrInputCapabilityHeader structure that was passed to vrapi_EnumerateInputDevices will have its DeviceID field set to the device ID of the enumerated controller.
The device state can then be queried by calling vrapi_GetInputTrackingState as described below.
Devices are considered connected once they are enumerated through vrapi_EnumerateInputDevices, and when vrapi_GetInputTrackingState and vrapi_GetCurrentInputState return valid results.
vrapi_EnumerateInputDevices does not do any significant work and may be called each frame to check if a device is present or not.
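Since enumeration is cheap, a typical pattern is to walk the indices each frame until vrapi_EnumerateInputDevices fails, recording the device ID of each controller of interest. A sketch under those assumptions:

```c
// Sketch: per-frame enumeration of all connected input devices.
#include "VrApi_Input.h"

void EnumerateDevices( ovrMobile * ovr )
{
    for ( uint32_t index = 0; ; index++ )
    {
        ovrInputCapabilityHeader capsHeader;
        if ( vrapi_EnumerateInputDevices( ovr, index, &capsHeader ) < 0 )
        {
            break;  // no device at this index; enumeration is complete
        }
        if ( capsHeader.Type == ovrControllerType_TrackedRemote )
        {
            // capsHeader.DeviceID identifies this controller for later
            // vrapi_GetCurrentInputState / vrapi_GetInputTrackingState calls.
        }
    }
}
```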
The state of the input device can be queried via the vrapi_GetCurrentInputState function.
This function takes a device ID and a pointer to an ovrInputStateHeader structure. Before calling it, fill in the header’s Type field with the type of device that is associated with the passed device ID. Make sure the structure passed is not just a header, but the appropriate state structure for the device type. For instance, when querying a controller, pass an ovrInputStateTrackedRemote structure with the Header.Type field set to ovrControllerType_TrackedRemote.
ovrInputStateTrackedRemote remoteState;
remoteState.Header.Type = ovrControllerType_TrackedRemote;
if ( vrapi_GetCurrentInputState( ovr, controllerDeviceID, &remoteState.Header ) >= 0 )
{
    // act on device state returned in remoteState
}
vrapi_GetCurrentInputState returns the controller’s current button and trackpad state.
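For example, the returned state can be inspected as follows (a sketch: the Buttons bit mask and the ovrButton_* constants come from VrApi_Input.h, while the analog trigger fields shown are assumptions to verify against your SDK version's header):

```c
// Sketch: acting on the state returned by vrapi_GetCurrentInputState.
#include "VrApi_Input.h"

void HandleRemoteState( const ovrInputStateTrackedRemote * remoteState )
{
    // Buttons is a bit mask of ovrButton_* flags.
    if ( remoteState->Buttons & ovrButton_Trigger )
    {
        // index trigger is pressed past the click threshold
    }
    if ( remoteState->Buttons & ovrButton_A )
    {
        // A face button is down (right Touch controller)
    }
    // Analog values (assumed field names for Quest Touch controllers):
    float indexTrigger = remoteState->IndexTrigger;  // 0.0 .. 1.0
    float gripTrigger  = remoteState->GripTrigger;   // 0.0 .. 1.0
    (void)indexTrigger; (void)gripTrigger;
}
```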
To query the orientation tracking state of a device, call vrapi_GetInputTrackingState and pass it a predicted pose time. Passing a predicted pose time of 0 will return the most recently sampled pose.
ovrTracking trackingState;
if ( vrapi_GetInputTrackingState( ovr, controllerDeviceID, 0.0, &trackingState ) >= 0 )
{
    // use the pose in trackingState.HeadPose.Pose
}
VrApi implements an arm model that uses the controller’s orientation to synthesize a plausible hand position each frame. The tracking state will return this position in the ‘Position’ field of the predicted tracking state’s HeadPose.Pose member.
Applications that implement their own arm models are free to ignore this position and calculate a position based on the Orientation field that is returned in the predicted tracking state’s pose.
Users may experience some orientation drift in the yaw axis, causing the physical controller’s orientation to go out of alignment with its VR representation.
To synchronize the physical controller’s orientation with the VR representation, users should recenter the controller by pressing and holding the Home button.
When a recenter occurs, the VrApi arm model is notified and the arm model’s shoulders are repositioned to align to the headset’s forward vector. This is necessary because the shoulders do not automatically rotate with the head.