Meta Quest 3 is officially here! Delivering significant improvements over Meta Quest 2, Quest 3 is ushering mixed reality into the mainstream. It offers new possibilities for you to enrich your apps with world-blending features and to boost performance with optimizations that can help your apps look, feel, and perform better than ever before.
When you build with Meta Quest, you’re tapping into an audience that has spent over $2 billion USD on apps and games in the Meta Quest Store. With Quest 3 available to audiences around the world, now is the time to get started building and updating your apps to grow your business.
With new capabilities and optimizations to explore on Quest 3, you might be asking yourself, “Where do I start?” That’s why we’ve created a brief guide to help you understand where to go to meet your development goals.
If you’re new to building for MR and want to start iterating and integrating with MR capabilities, keep reading below. If you’re building a fully immersive VR experience or you want to boost the performance of your existing apps, jump to the “Punch Up Performance” section.
If You’re New to Developing for Mixed Reality
Building for MR isn’t reserved for seasoned developers or large studios. If you’re just starting out, we recommend reviewing the basics by visiting our MR Design Guidelines. You’ll gain a better understanding of what MR is, how it can be used, and the core capabilities that power MR on Meta Quest. By referencing the following pages, you can reduce startup time and learn key considerations when designing an MR project from scratch:
Once you’ve learned more about these capabilities and what you can do with them, you can reference technical documentation to find step-by-step instructions on getting started with integration:
We recommend checking out our samples and showcase apps for best practices on integrating MR features. You can also find samples for Passthrough (Unity | Unreal) and Scene (Unity | Unreal), as well as tutorials on using Spatial Anchors (Unity | Unreal).
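If you’re curious how lightweight the integration can be, here’s a minimal sketch of surfacing Passthrough at runtime in Unity with the Meta XR SDK. The component and input names (OVRPassthroughLayer, OVRInput) come from the Unity integration, but the button-toggle behavior is purely illustrative, so treat this as a sketch rather than the samples’ exact approach:

```csharp
// A minimal sketch (Unity + Meta XR SDK): blending Passthrough into a scene.
// Assumes an OVRCameraRig whose OVRManager has Passthrough support enabled,
// and a camera clear color with zero alpha so the real world shows through.
using UnityEngine;

public class PassthroughToggle : MonoBehaviour
{
    private OVRPassthroughLayer _layer;

    private void Start()
    {
        // OVRPassthroughLayer composites the headset camera feed with your content.
        _layer = gameObject.AddComponent<OVRPassthroughLayer>();
        _layer.overlayType = OVROverlay.OverlayType.Underlay; // behind virtual objects
    }

    private void Update()
    {
        // Toggle the world-blended view with the A button while experimenting.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            _layer.hidden = !_layer.hidden;
        }
    }
}
```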
Showcases are playable reference apps that demonstrate best practices for integrating multiple features and offer inspiration for applying new ones. Phanto and Discover (GitHub | App Lab) are our latest MR showcases, which you can use to start experimenting with a functional project and translate that experience to your own builds. We have more MR showcases coming soon across engines and use cases, so make sure to bookmark our repo to stay updated on the latest.
Start Integrating Mixed Reality
After you open up your Quest 3 and you’re ready to start developing for MR, the first thing you’ll want to do is download and install SDK v57 (Unity | Unreal | Native) to ensure you have access to the latest tools and capabilities.
To help streamline setting up and integrating our MR capabilities, we recommend using these tools:
- Download Meta XR Simulator (Unity | Unreal | Native) to unlock rapid Quest development without needing a physical device. Meta XR Simulator is a lightweight XR runtime built for developers that simulates Meta Quest headsets and features at the API level. It can make day-to-day development easier by letting you test and debug your apps without frequently putting on and taking off a headset, and it helps scale automation by simplifying your testing environment setup. Meta XR Simulator also simplifies MR development with the Synthetic Environment Server (SES), which lets MR developers simulate the physical world via synthetic environments for quick and easy development and testing.
- The Project Setup Tool enables you to quickly configure projects and to assess and fix issues that may impact your build. To access the tool, navigate to Edit > Project Settings > Oculus, or Oculus > Tools > Project Setup Tool. Additionally, the status icon in the bottom-right corner of the Editor will lead you directly to the tool. The Project Setup Tool is currently available in Unity, with Unreal support coming soon.
- Download Building Blocks for an easier way to integrate Presence Platform capabilities and explore new MR use cases in Unity. This tool gives you access to a library of feature “blocks” that you can drag and drop into your project, including Passthrough, Hand Tracking, Eye Gaze, and more. Its streamlined configuration can also help you combine features in new or existing projects and worry less about getting our SDKs to work together.
- If you’re a WebXR developer, we recommend using Immersive Web Emulator to test and iterate on your WebXR experiences more easily without a headset, and Reality Accelerator Toolkit to help integrate MR features.
Soon, you’ll also be able to use the Mixed Reality Development Kit to quickly access a rich set of utilities and tools built on top of Scene API that can help you develop spatially aware apps faster. These utilities and tools include scene queries, graphical helpers, a scene debugger, and more.
We’re also working to support developers building in Unreal with new MR templates that can help you save time during MR project setup and enable quick prototyping or experimentation with Passthrough, Scene, and more. Stay tuned for more details on the Mixed Reality Development Kit and MR templates coming soon.
Tools to Unlock Faster Development
You can use the following tools to help expedite testing of your MR experiences:
- We recommend that all Quest developers download Meta Quest Developer Hub (MQDH), if you haven’t already. MQDH is an essential desktop companion app that accelerates your daily workflow by streamlining frequent tasks like device management, performance analysis, capturing the headset display, managing device files, and uploading apps. Whether you’re new to developing with Quest or an existing fan of MQDH, download MQDH v4.0 or higher for Quest 3 support.
- Set up Meta Quest Link (Unity | Unreal) to unlock faster in-headset iteration on Quest 3 by eliminating the need to deploy your app to a headset every time you test during development. Even with our simulation tools’ powerful capabilities, some aspects of development, such as testing fast-paced interactions with tracked movements, are better suited to an in-headset workflow. Quest Link enables you to test your app at full capacity with one click in the Unity and Unreal Editors.
- Above we shared how Meta XR Simulator (Unity | Unreal | Native) can help unlock faster MR development, but it can also help you iterate on your fully immersive VR apps faster by simulating Meta Quest headsets and features at the API level.
Developers building in Unity can find more tips on improving iteration time here.
Punch Up Performance
Performance has a direct impact on user experience, affecting graphics, latency, loading times, frame rate, and battery life. Apps currently running on Quest 2 will automatically see a big leap in performance on Quest 3 thanks to the new Qualcomm Snapdragon XR2 Gen 2 platform and display optics, but the optimizations don’t have to end there.
With twice the GPU power, a 33% faster CPU, and over 30% more memory than Quest 2, Quest 3 gives you the spare headroom to optimize your app in a variety of ways:
- Add pre- and post-processing render passes to generate effects like shadow mapping, tonemapping, SSAO, bloom, and more
- Render at a higher eye buffer resolution (see the sketch after this list)
- Increase frames per second (FPS) to provide lower latency and more comfort for your audience
- Adjust shader quality by switching to more expensive shading models like PBR
- Increase the texture resolution of your assets—this will increase GPU time spent during texture sampling but can deliver substantial visual quality improvements
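To make the resolution and frame rate items above concrete, here’s a hedged Unity sketch of the two simplest knobs. XRSettings.eyeTextureResolutionScale is standard Unity XR API, and OVRManager.display exposes refresh rates in the Meta XR SDK; the specific values below (1.2x scale, 90 Hz) are illustrative, not recommendations:

```csharp
// A sketch (Unity + Meta XR SDK) of spending Quest 3 headroom on sharper
// rendering and a smoother frame rate. The 1.2x scale and 90 Hz target are
// illustrative values; profile before committing to either.
using System.Linq;
using UnityEngine;
using UnityEngine.XR;

public class HeadroomTuning : MonoBehaviour
{
    private void Start()
    {
        // Render each eye buffer at 120% of the default resolution.
        XRSettings.eyeTextureResolutionScale = 1.2f;

        // Request a higher refresh rate if the device reports it as available.
        const float targetHz = 90f;
        if (OVRManager.display.displayFrequenciesAvailable.Contains(targetHz))
        {
            OVRManager.display.displayFrequency = targetHz;
        }
    }
}
```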
Even relatively small GPU percentage gains can lead to a big payoff. Whether you’re aiming to lower latency, render crisper visuals, or prevent frame drops, making the most of your GPU budget is essential for enhancing your app’s user experience.
Here are some recommendations on how you can harness the performance capabilities of Quest 3 and push your app to the limit of what our next-generation headset can offer:
Easy GPU Wins
- Enable Dynamic Resolution (Unity | Unreal) to maintain your app’s frame rate while rendering at an optimal resolution. This tool allows you to increase your app’s image quality whenever possible by automatically adjusting the resolution during heavy GPU work. Keep in mind that we recommend turning off Dynamic Resolution when profiling your apps (more on that below).
- For more gains in visual quality, we recommend enabling Meta Quest Super Resolution, a single-pass spatial upscaling and sharpening technique that uses edge- and contrast-aware filtering to preserve and enhance details in the foveal region while minimizing halos and artifacts.
- Target eye buffer resolutions are now higher on Quest 3, with a new default of 1680x1760, nearly a 30% leap from Quest 2. Many apps will automatically adjust to the new default, but with even more headroom available, you can use your compute budget to target an even higher eye buffer resolution.
Steps to Evaluate, Optimize, and Push Performance
- The first step in evaluating performance should be to use OVR Metrics Tool (Unity | Unreal | Native) to determine how GPU-bound your app is. This will give you a better idea of how you can use your GPU headroom.
- Profiling is essential for evaluating performance and overcoming GPU and CPU hurdles. Downloading RenderDoc Meta Fork gives you access to low-level GPU profiling data from Quest 3’s Snapdragon XR2 Gen 2 platform that is key for identifying and fixing bottlenecks. RenderDoc is also a commonly used graphics debugger, and you can find walkthroughs of several common use cases for optimizing apps with RenderDoc here (Unity | Unreal).
- To verify that the GPU is doing what you expect, it’s important to use tools that show loads, stores, render pass configurations, and more. ovrgpuprofiler (Unity | Unreal) and GPU Systrace (Unity | Unreal) are low-friction render stage tracing tools designed specifically to display this information.
- If you’re running into performance issues and aren’t sure why, these Basic Optimization Workflows (Unity | Unreal | Native | WebXR) provide simple walkthroughs to help you identify and fix bottlenecks using profiling.
Additional Tools for More Compute
- Using Application SpaceWarp can lead to major latency and performance improvements depending on the type of app you’ve built: it gave apps up to 70% additional compute in our initial testing, potentially with little to no perceptible artifacts. AppSW may not be suitable for every app, which is why we strongly recommend referring to our best practices to determine if and how it can be used in your project (a minimal enabling sketch follows below).
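Assuming your project meets the AppSW requirements (Vulkan and motion vector output from your shaders), turning it on from script in Unity is a one-liner; this sketch uses OVRManager.SetSpaceWarp from the Meta XR SDK:

```csharp
// A minimal sketch (Unity + Meta XR SDK): enabling Application SpaceWarp.
// Assumes the project already meets AppSW requirements (Vulkan, motion
// vector support in shaders, AppSW enabled in project settings).
using UnityEngine;

public class AppSpaceWarpController : MonoBehaviour
{
    private void OnEnable()
    {
        // The runtime renders at half rate and synthesizes in-between frames
        // from depth and motion vectors, freeing compute for your app.
        OVRManager.SetSpaceWarp(true);
    }

    private void OnDisable()
    {
        OVRManager.SetSpaceWarp(false);
    }
}
```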
When your app uses capabilities like Passthrough to support MR, expect about 17% less GPU and 14% less CPU available compared to VR-only experiences, and certain Presence Platform features like Depth API will consume additional GPU resources on top of that. But even if you’re running a full MR experience on Quest 3, your leftover compute budget will still be higher than the entire compute budget available on Quest 2. For more information about performance on Quest 3, watch the Meta Connect session “State of Compute: Maximizing Performance on Meta Quest.”
Design and Integrate Haptics
Quest 3 controllers come with TruTouch haptics, enabling a new level of immersion and expanding your creative possibilities. To unlock these new capabilities, download Meta Quest Haptics Studio (Windows | Mac). With this suite of state-of-the-art haptics tools, you can quickly and easily design, audition, and integrate high-quality haptics into your apps.
Getting started is easy: design your haptics from your existing audio effects, instantly feel your haptic creations using the VR companion app, export your effects, and integrate them into your app using Haptics SDK (Unity | Unreal).
Haptics SDK enables you to integrate your haptics into your app, offering a media-like API to trigger and control haptics in the same way you’d control audio. The SDK’s runtime detection system will optimize the haptic signal for the currently connected controller, ensuring your haptic clips are both backward- and forward-compatible across Meta Quest devices.
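As a rough illustration of that media-like API, here’s a sketch of playing a clip exported from Haptics Studio in Unity. The class names (HapticClip, HapticClipPlayer, Controller) follow the Haptics SDK’s Unity integration; check the SDK documentation for the current, exact API:

```csharp
// A sketch (Unity + Meta Haptics SDK): playing a .haptic clip exported from
// Meta Quest Haptics Studio. Assumes a HapticClip asset assigned in the
// Inspector; class and method names follow the SDK's Unity integration.
using Oculus.Haptics;
using UnityEngine;

public class ImpactHaptics : MonoBehaviour
{
    [SerializeField] private HapticClip impactClip; // exported from Haptics Studio

    private HapticClipPlayer _player;

    private void Start()
    {
        // The runtime detection system tailors the signal to the connected controller.
        _player = new HapticClipPlayer(impactClip);
    }

    // Call this from your gameplay code, e.g. when the player lands a hit.
    public void PlayImpact()
    {
        _player.Play(Controller.Right);
    }

    private void OnDestroy()
    {
        _player.Dispose();
    }
}
```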
Experiment with Cutting-Edge Capabilities
We’re excited about Quest 3’s capabilities and the possibilities they unlock for you to engage audiences with MR. We’re just getting started, and if you want to be among the first to use our newest capabilities, be sure to reference our experimental technical documentation.
To be among the first to hear news and updates that can benefit how you develop with Meta Quest, follow us on Facebook and X.