All Oculus Quest developers must pass the concept review prior to gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.
Developers who wish to target multiple platforms and devices may use the Oculus Integration to build and target platforms that support OpenVR. This page will detail the APIs that are supported for cross-platform development and any differences in functionality from typical Oculus development. This page will not detail the typical usage of these APIs and development process that is described elsewhere in the Unity guide.
Cross-platform development allows developers to write an app that, when targeted separately for either the Oculus or SteamVR platforms, will work out-of-the-box with minimal additional work. Cross-platform input support generally covers a 6DOF HMD and controllers, such as the Oculus Rift S and Touch controllers, the HTC Vive™ and its controllers, and Windows Mixed Reality headsets and motion controllers.
Follow the same steps described in the Oculus Utilities for Unity guide. If you’re updating an existing app, you will need to delete the existing camera and drag a new camera prefab into the scene to track the OpenVR HMD + controllers.
When adding the OVRCameraRig prefab, you must select the ‘FloorLevel’ tracking origin; ‘EyeLevel’ is not supported for cross-platform development.
Controller tracked objects must be made children of either LControllerAnchor or RControllerAnchor for cross-platform development.
The following OVRDisplay APIs are supported for cross-platform development to retrieve the HMD’s velocity of movement relative to the local tracked space.
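As a minimal sketch (assuming a scene that already contains an OVRManager instance, and that the velocity and angular velocity properties are among those supported), the HMD’s motion can be read from OVRManager.display:

```csharp
using UnityEngine;

public class HmdVelocityLogger : MonoBehaviour
{
    void Update()
    {
        // OVRDisplay reports the HMD's motion relative to the local tracked space.
        OVRDisplay display = OVRManager.display;
        Vector3 velocity = display.velocity;               // linear velocity (m/s)
        Vector3 angularVelocity = display.angularVelocity; // angular velocity (rad/s)
        Debug.Log($"HMD velocity: {velocity}, angular velocity: {angularVelocity}");
    }
}
```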
Using the OVRInput APIs for Oculus controllers is described in the OVRInput guide. For cross-platform development the following APIs are supported.
Individual Buttons - Get(), GetDown(), and GetUp() are supported for the buttons listed below, where Get() returns the current state of the control (true if pressed), GetDown() returns whether the control was pressed during the frame (true if pressed), and GetUp() returns whether the control was released during the frame (true if released). Mapping for Oculus controllers is provided on the OVRInput page.
Both the Oculus Touch controllers and the Vive controllers are treated by these APIs as “Touch” to preserve backward compatibility with existing apps. Left XR controller = LTouch, right XR controller = RTouch. Button/control states can be requested for the following -
We encourage developers to specify a Touch controller (either LTouch or RTouch) as the second argument for cross-platform usage, with a “primary” binding as the first argument. You are not required to specify a Touch controller as the second argument, but it often simplifies cross-platform code. For example -
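A short sketch of the pattern described above (the specific Button values used here are illustrative; consult the mapping tables below for the bindings on each controller):

```csharp
// Current state of the "One" button on the right controller; on a Vive or
// WMR controller this maps to the equivalent control per the tables below.
bool pressed = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);

// True only on the frame the left controller's index trigger was pressed.
bool triggered = OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger,
                                  OVRInput.Controller.LTouch);
```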
Controller Position and Velocity - The following OVRInput APIs are supported for cross-platform development to retrieve the controller’s position in space and velocity of movement relative to the local tracked space.
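As a sketch of these calls (assuming GetLocalControllerRotation and GetLocalControllerAngularVelocity are among the supported APIs, alongside position and linear velocity):

```csharp
// Position and orientation of the right controller in the local tracked space.
Vector3 position = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
Quaternion rotation = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

// Linear and angular velocity of the same controller.
Vector3 velocity = OVRInput.GetLocalControllerVelocity(OVRInput.Controller.RTouch);
Vector3 angularVelocity = OVRInput.GetLocalControllerAngularVelocity(OVRInput.Controller.RTouch);
```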
Button Mapping for Supported OpenVR Controllers
HTC Vive Controller
The OVRInput APIs described above map to the following buttons on the HTC Vive controller -
Microsoft Mixed Reality Motion Controller
The OVRInput APIs described above map to the following buttons on the Microsoft Mixed Reality motion controller, which can be found on Microsoft’s Motion controllers page (see controller image under “Hardware details”).
Cross-platform haptics support is enabled through the SetControllerVibration API. The OVRHaptics and OVRHapticsClip APIs are not supported for cross-platform development.
Usage of the API is described in the OVRInput guide. Cross-platform devices support amplitudes at any increment between 0 and 1, inclusive, and a frequency of 1.0. For example -
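A minimal example of the call, under the constraints above (the 0.5 amplitude is an arbitrary illustrative value):

```csharp
// Start vibrating the right controller: frequency must be 1.0 for
// cross-platform devices; amplitude may be any value in [0, 1].
OVRInput.SetControllerVibration(1.0f, 0.5f, OVRInput.Controller.RTouch);

// Pass zeros to stop the vibration.
OVRInput.SetControllerVibration(0.0f, 0.0f, OVRInput.Controller.RTouch);
```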
OVRBoundary allows developers to retrieve and set information for the user’s play area as described in the OVRBoundary guide. The following APIs are supported for cross-platform development.
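A sketch of reading play-area information through OVRManager.boundary (assuming GetConfigured, GetDimensions, and GetGeometry are among the supported APIs):

```csharp
// OVRBoundary is accessed through the OVRManager singleton.
OVRBoundary boundary = OVRManager.boundary;

if (boundary.GetConfigured())
{
    // Axis-aligned dimensions (width, height, depth) of the play area, in meters.
    Vector3 dimensions = boundary.GetDimensions(OVRBoundary.BoundaryType.PlayArea);

    // Points outlining the user's configured play area.
    Vector3[] geometry = boundary.GetGeometry(OVRBoundary.BoundaryType.PlayArea);
    Debug.Log($"Play area: {dimensions}, {geometry.Length} boundary points");
}
```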
Information about adding an overlay using OVROverlay is described in the VR Compositor Layers guide. At this time, only Quad world-locked overlays are supported for cross-platform development.
To use a cross-platform overlay, add a quad gameobject to the scene, delete the mesh renderer and collider components, add an OVROverlay component to the quad, and specify a texture to display.
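The editor steps above can also be performed at runtime; a hedged sketch (the overlayTexture field is a hypothetical name for whatever texture you assign in the Inspector):

```csharp
using UnityEngine;

public class CrossPlatformOverlay : MonoBehaviour
{
    public Texture overlayTexture; // texture to display on the quad

    void Start()
    {
        // Create the quad and strip the components the overlay does not need.
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Destroy(quad.GetComponent<MeshRenderer>());
        Destroy(quad.GetComponent<Collider>());

        // Add the OVROverlay component and point it at the texture to display.
        OVROverlay overlay = quad.AddComponent<OVROverlay>();
        overlay.textures = new Texture[] { overlayTexture };
    }
}
```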
Oculus Avatars support cross-platform development in Unity. See the Unity CrossPlatform Sample Scene guide for information about using cross-platform Avatars.
Copyrights and Trademarks © Facebook Technologies, LLC. All Rights Reserved.
Oculus and the Oculus Logo are trademarks of Facebook Technologies, LLC. HTC and the HTC Vive logo are the trademarks or registered trademarks in the U.S. and/or other countries of HTC Corporation and its affiliates. All other trademarks are the property of their respective owners.