Developing extended reality apps for Horizon OS in Unity
Updated: Jan 23, 2025
This page provides a brief overview of the development workflow for Meta Horizon OS extended reality (XR) applications in Unity.
To develop Unity apps for Horizon OS, ensure you have the following:
Note: Some recommended testing and debugging tools have additional requirements.
Developing for Horizon OS in Unity
Meta provides a number of useful Unity packages to help you develop XR apps for Meta Quest headsets. The Meta XR Core SDK, for example, includes a custom XR rig and support for fundamental XR features. Other specialized Meta XR SDKs enable you to integrate different types of user input into your Unity project.
In a standard 3D scene, a single virtual camera captures the “Game View”. This contrasts with XR scenes, which use a rig consisting of multiple virtual cameras and scripts that map headset movements to the view presented to the end user.
In Unity XR development, the XR rig (also called the “XR Origin” in Unity documentation) serves as the center of tracking space in an XR scene. The XR rig contains the GameObjects that map controller and headset tracking data from your device to the scene in your app. This tracking data is used to move the scene camera, and it also factors into controlling interactions and animating controller and hand poses.
The Meta XR Core SDK includes the OVRCameraRig prefab, which contains a custom XR rig that replaces Unity’s conventional Main Camera. Other Meta XR SDKs for Unity include similar XR rig prefabs, with some slight differences. For example, the Interaction SDK includes the OVRCameraRigInteraction prefab, which extends the Meta XR Core SDK’s OVRCameraRig with controller and hand tracking.
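As a minimal sketch of how a script can read tracking data from the rig, the following MonoBehaviour reads the headset pose from the OVRCameraRig’s centerEyeAnchor each frame. It assumes the OVRCameraRig prefab is present in your scene and assigned in the Inspector; the class name HeadPoseReader is illustrative.

```csharp
using UnityEngine;

// Minimal sketch: read the headset pose each frame from the anchor Transforms
// exposed by the Meta XR Core SDK's OVRCameraRig.
public class HeadPoseReader : MonoBehaviour
{
    // Assign the OVRCameraRig from your scene in the Inspector.
    [SerializeField] private OVRCameraRig cameraRig;

    private void Update()
    {
        // centerEyeAnchor tracks the headset; leftHandAnchor and rightHandAnchor
        // track the controllers (or hands) in the same way.
        Vector3 headPosition = cameraRig.centerEyeAnchor.position;
        Quaternion headRotation = cameraRig.centerEyeAnchor.rotation;

        Debug.Log($"Head position: {headPosition}, rotation: {headRotation.eulerAngles}");
    }
}
```

The same pattern applies to the rig’s hand anchors if you need controller poses for your own scripts.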
Apps developed with Meta XR SDKs can access and handle input from a user’s head, hands, face, and voice using Meta Quest headset and touch controller technology.
Users can interact with XR applications in a number of immersive ways. For example, they can move their head to reveal different areas of their environment, reach out with their hands to grab objects, or use the buttons on their controllers to perform complex operations in their environment, like locomotion.
Interactions dictate how a user’s hand movements and controller actions affect the objects and environment around them.
You can create custom interactions with raw headset and controller input using low-level APIs provided with the Meta XR Core SDK, but we recommend that you leverage prefabricated, customizable interaction components provided by Meta in the Meta XR Interaction SDK.
Meta Quest headsets track the complex movements of a user’s body and face. Using the Meta XR Movement SDK, you can leverage the tracking capabilities of the Meta Quest hardware to create immersive and responsive experiences for users.
Your app might require users to enter text. Horizon OS apps for Meta Quest can process text from a virtual keyboard using APIs included with the Meta XR Core SDK. Alternatively, you can use a physical keyboard with tracking provided via the Mixed Reality Utility Kit (MRUK).
In addition to physical movements and controller input, you can develop apps that enable users to interact with their environment with their voice. Use the Meta XR Voice SDK to enhance the XR experience with more natural and flexible ways for people to interact with the app. For example, voice commands can shortcut controller actions with a single phrase, or interactive conversation can make the app more engaging.
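As a hedged sketch of what this can look like in a script, the example below activates voice capture when the user presses the A button and logs the resulting transcription. It assumes an AppVoiceExperience component already configured with a Wit.ai app is assigned in the Inspector; the class name VoiceCommandListener is illustrative, and you should verify the event names against the Voice SDK reference for your SDK version.

```csharp
using Oculus.Voice;
using UnityEngine;

// Hedged sketch: activate voice capture on a button press and log the
// transcription returned by the Voice SDK. Assumes an AppVoiceExperience
// component configured with your Wit.ai app is assigned in the Inspector.
public class VoiceCommandListener : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience appVoiceExperience;

    private void OnEnable()
    {
        // VoiceEvents exposes UnityEvents for transcription and response callbacks.
        appVoiceExperience.VoiceEvents.OnFullTranscription.AddListener(OnTranscription);
    }

    private void OnDisable()
    {
        appVoiceExperience.VoiceEvents.OnFullTranscription.RemoveListener(OnTranscription);
    }

    private void Update()
    {
        // Start listening when the user presses the A button on the right Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            appVoiceExperience.Activate();
        }
    }

    private void OnTranscription(string transcription)
    {
        Debug.Log($"User said: {transcription}");
    }
}
```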
Using the Meta XR Core SDK, you can access raw controller and headset input and use that input to control objects in your Unity project or to create your own interactions and gestures.
Note: We recommend that you use the customizable interactions shipped with Meta XR Interaction SDK for controller and hand input.
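If your app does need raw input, the following is a minimal sketch that polls OVRInput each frame to move a GameObject. It assumes an OVRCameraRig (which includes OVRManager) is present in the scene; the class name RawInputMover is illustrative.

```csharp
using UnityEngine;

// Minimal sketch: poll raw controller input with the Meta XR Core SDK's
// OVRInput API and use it to move the GameObject this script is attached to.
public class RawInputMover : MonoBehaviour
{
    [SerializeField] private float speed = 1.5f;

    private void Update()
    {
        // Read the right Touch controller thumbstick as a 2D axis.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
        transform.position += new Vector3(stick.x, 0f, stick.y) * speed * Time.deltaTime;

        // Detect the frame on which the A button was pressed.
        if (OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.RTouch))
        {
            Debug.Log("A button pressed");
        }
    }
}
```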
Integrating mixed reality features
The Mixed Reality Utility Kit (MRUK) includes a set of utilities and tools to perform common operations when building spatially aware apps for Meta Quest. MRUK makes it easier to program against the physical world and leverage other mixed reality features provided by Meta.
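The sketch below illustrates the general pattern of querying the user’s scanned room through MRUK after scene data loads. The member names used here (RegisterSceneLoadedCallback, GetCurrentRoom, FloorAnchor) and the class name RoomQuery reflect assumptions about the MRUK package; check them against the MRUK reference for your SDK version.

```csharp
using Meta.XR.MRUtilityKit;
using UnityEngine;

// Hedged sketch: query the user's scanned room once MRUK has loaded scene data.
// Member names are assumptions based on the MRUK package; verify against the
// MRUK reference for your SDK version.
public class RoomQuery : MonoBehaviour
{
    private void Start()
    {
        // MRUK loads the room layout captured by Space Setup asynchronously.
        MRUK.Instance.RegisterSceneLoadedCallback(OnSceneLoaded);
    }

    private void OnSceneLoaded()
    {
        MRUKRoom room = MRUK.Instance.GetCurrentRoom();
        if (room != null && room.FloorAnchor != null)
        {
            Debug.Log($"Floor anchor position: {room.FloorAnchor.transform.position}");
        }
    }
}
```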
Testing and debugging your app is a critical step in the XR development workflow. Meta provides a number of useful tools and workflows, both as extensions of the Unity Editor and as separate applications.
To explore new features, use the in-editor Building Blocks tool that ships with Meta XR Core SDK.
You can also test out features by dragging and dropping prefabs that are shipped with Meta XR SDKs, or by checking out Meta XR SDK samples.
To test your project during development, use the following tools:
- Link, a tool that enables you to stream your app to a headset that is connected to your development machine via USB-C or over Wi-Fi.
  Note: Link is currently only supported on Windows. If you are developing on macOS, use the Meta XR Simulator to test your projects during development.
- Meta XR Simulator, a tool that simulates the extended reality environment of a headset on your development machine. XR Simulator is available for both Windows and macOS development and has the added benefit of allowing you to quickly view changes to your project without needing to use a physical headset.
To manage your test devices during development, use Meta Quest Developer Hub (MQDH).
MQDH enables you to do the following:
- View device logs and generate Perfetto traces to help with debugging
- Capture screenshots and record videos of what you see in the headset
- Deploy apps directly to your headset from your computer
- Share your VR experience by casting the headset display to the computer
- Download the latest Meta Quest tools and SDKs
- Upload apps to the Meta Quest Developer Dashboard for store distribution
- Disable the proximity sensor and the boundary system for an uninterrupted testing workflow
To learn more about creating XR scenes for Meta Quest in Unity, see the following resources:
To learn how to quickly set up a Unity project for Meta XR development, see Set Up Unity.