Optimizing for Success: Insights into Developing for Meta Quest 3 and What to Do Once You’ve Published
We’re excited to share a recap of yet another day of sessions at GDC. Today we’re sharing insights and best practices for developing for Meta Quest, including some of the latest tools for delivering optimized experiences. We’re also discussing best practices for developers who have published an app on our platform and how to leverage App Lab as a business and marketing tool.
Making the most of the magic of Meta Quest
What's different about making games for a mobile, immersive headset? Well, a lot. Ben Walker shared his guidance from years of helping developers publish experiences on Meta Quest. One important consideration, and what sets Meta Quest apart from PC or console development, is that our headset is a battery-powered device for virtual and mixed reality. While that may seem obvious, it carries architectural implications for everyone building an experience for Meta Quest.
Most of Meta Quest 3’s battery power goes to running the CPU and GPU. We have a system of performance levels, where higher levels equate to higher clock speeds. For most developers, we recommend temporarily reducing your app’s requested clock speeds whenever you aren’t fully utilizing the CPU or GPU. This drives the idea of a zero-sum framework: you have to make tradeoffs in order to run at max capacity. The components or features of your app that lean heavily on the CPU or GPU are what we consider ‘expensive’ features, and one of the most notable is passthrough.
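For Unity developers using the Meta XR Core SDK, requesting lower clocks can be as simple as setting OVRManager's suggested performance levels. Here's a minimal sketch, assuming the SDK's ProcessorPerformanceLevel enum; when and how far to drop levels depends entirely on your app's load:

```csharp
using UnityEngine;

// Minimal sketch (Meta XR Core SDK assumed): lower the suggested CPU/GPU
// performance levels when your app doesn't need peak clocks, to save
// battery and thermal headroom.
public class PerfLevelTuner : MonoBehaviour
{
    void Start()
    {
        // Ask the OS for sustained, lower CPU clocks; raise these back up
        // (e.g. to SustainedHigh) before entering a heavy scene.
        OVRManager.suggestedCpuPerfLevel = OVRManager.ProcessorPerformanceLevel.SustainedLow;
        OVRManager.suggestedGpuPerfLevel = OVRManager.ProcessorPerformanceLevel.SustainedHigh;
    }
}
```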
Passthrough is much more than a video feed from the headset’s cameras. Beyond reducing simulator sickness, it also underpins hand tracking: the objective is to make the passthrough view of the user’s hand align with reality. As the GIF below shows, a couple of centimeters of offset between the user’s eye and the camera can dramatically change where their hand appears from the eye’s point of view (black) versus the camera’s (red).
If these views are not aligned, the passthrough feed breaks your proprioception, the unconscious sense of where your limbs are. This matters if you’re designing experiences where the user has to catch a thrown ball or quickly point at a small in-game button. Even if you don’t plan to leverage hand tracking, this same alignment is what makes your held controllers, as seen in passthrough, line up with their in-game representations.
It’s worth noting that on Meta Quest 3, we add the passthrough feed after the entire app has rendered. This is done both for security, so apps cannot capture or store photos of a user’s environment, and for latency optimization. Other applications can also add to the final frame the user sees; on the right you can see OVR Metrics Tool, our developer HUD, which displays stats like framerate.
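For context, here's a minimal sketch of turning passthrough on in Unity, assuming the Meta XR SDK's OVRPassthroughLayer component and a standard OVRCameraRig with passthrough enabled in OVRManager's project settings:

```csharp
using UnityEngine;

// Minimal sketch (Meta XR SDK assumed): render passthrough as an underlay
// behind your scene. Attach alongside the OVRCameraRig; requires the
// Passthrough feature to be enabled for the project.
public class PassthroughToggle : MonoBehaviour
{
    void Start()
    {
        var layer = gameObject.AddComponent<OVRPassthroughLayer>();
        // Underlay: the compositor draws the camera feed beneath the
        // app's rendered frame, after the app has finished rendering.
        layer.overlayType = OVROverlay.OverlayType.Underlay;
    }
}
```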
Compositor
The system that combines your app’s submitted frame, the passthrough cameras, and frames from any other apps, such as OVR Metrics Tool or the system menu, is called the Compositor. One of the Compositor’s jobs is stacking together the layers submitted by different running processes, which may come in at different frame rates and resolutions. Its other job is rendering those images to the screen.
The images sent to our screens are barrel distorted, and that distortion is canceled out by pincushion distortion in the lenses. This lets us use simpler lenses and puts more pixels at the center of the screen; a real win-win scenario. Quest 3 has one screen per eye, and the screens move with the lenses, so inter-pupillary distance doesn’t affect what we render to screen. This is relevant to text quality: when you take a source texture of some text, render it onto an in-world menu or dialog box in your scene, and then let the compositor distort that scene for the lenses, the text is resampled at every step of that chain, and each resample applies a little more blur.
How do you solve this? There’s really only one way: collapse the transformation from input texture to in-world render, and the transformation from in-world render to lens-distorted output, into one transformation that samples only once. That is exactly what a compositor layer does. If you aren’t sure about your app’s text quality, ask whether your text is as easy to read as built-in apps, like our Store. If you’re making a user interface that will hold more than a sentence of text, we recommend spending the effort to put that user interface on a compositor layer.
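In Unity with the Meta XR SDK, a compositor layer is typically an OVROverlay component. Here's a minimal sketch, assuming a render texture your UI already draws into; the component and field names come from the Meta XR Core SDK, but treat the exact setup as illustrative:

```csharp
using UnityEngine;

// Minimal sketch (Meta XR SDK assumed): put a UI texture on a compositor
// quad layer so the compositor samples it exactly once, directly into the
// lens-distorted output, keeping text crisp.
public class UiCompositorLayer : MonoBehaviour
{
    public Texture uiTexture; // e.g. a RenderTexture your UI canvas draws into

    void Start()
    {
        var overlay = gameObject.AddComponent<OVROverlay>();
        overlay.currentOverlayShape = OVROverlay.OverlayShape.Quad;
        overlay.currentOverlayType = OVROverlay.OverlayType.Overlay;
        overlay.textures[0] = uiTexture;
        // The quad's position and scale come from this GameObject's transform.
    }
}
```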
Target Frame Rate and Application Spacewarp
We often hear from developers struggling to hit their target framerates on Quest 3. As we mentioned in the beginning, it’s important to remember (and design with this in mind) that Quest 3 is battery powered and that most of this power goes to running the CPU and GPU.
One result of this architecture is that additional render passes cost more. Many systems in Unity and Unreal – blurring, bloom, HDR, realtime shadows, deferred rendering – are, in fact, additional render passes. While they’re supported on Quest, their extra cost means you should put much more thought and care into picking your render passes than Unity or Unreal defaults would suggest.
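As a concrete example, in Unity's Universal Render Pipeline you can make sure your XR camera isn't paying for passes you never consciously chose. A minimal sketch, assuming URP; the right set of toggles depends on your art direction:

```csharp
using UnityEngine;
using UnityEngine.Rendering.Universal;

// Minimal sketch (Unity URP assumed): strip extra render passes from the
// XR camera unless you've deliberately budgeted for them.
public class TrimRenderPasses : MonoBehaviour
{
    void Start()
    {
        var camData = GetComponent<Camera>().GetUniversalAdditionalCameraData();
        camData.renderPostProcessing = false; // bloom, blur, tonemapping, etc.
        camData.renderShadows = false;        // skip realtime shadow passes
    }
}
```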
Some of our rendering-heavy games, like Assassin’s Creed: Nexus, use a system we offer called Application SpaceWarp. The idea is simple: instead of your app rendering every single frame, Quest synthesizes every other frame, giving you twice as long to generate each one. Your CPU and GPU get roughly double the cycles to compute and render something incredible for your user.
There are some downsides to take into account when using Application SpaceWarp. You have to generate motion vectors, which can mean modifying every material your game uses to support motion-vector rendering. It also doesn’t play well with transparency, since the in-between frames Application SpaceWarp generates only support each individual pixel moving in one direction. Fast motion causes artifacts too: if the player is holding or waving a sword around very quickly, that kind of gameplay will expose SpaceWarp artifacts. Finally, generating motion vectors consumes frame time of its own, so expect it to take around 30% of your new, longer frame time.
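In Unity, Application SpaceWarp is toggled through OVRManager once your project is configured for it (motion-vector-capable shaders and the appropriate project settings). A minimal sketch:

```csharp
using UnityEngine;

// Minimal sketch (Meta XR SDK assumed): toggle Application SpaceWarp at
// runtime. Your materials must output motion vectors for synthesized
// frames to look right; transparency and fast motion can show artifacts.
public class SpaceWarpToggle : MonoBehaviour
{
    public bool enableSpaceWarp = true;

    void Start()
    {
        OVRManager.SetSpaceWarp(enableSpaceWarp);
    }
}
```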
Key Features to aid Development
While the Quest 3 features we’ve discussed add a lot of magic, they’re dense and require a deep understanding of the physical realities of VR. There are other features that can help with development on Quest 3 as well.
We’ve heard from developers that many games, such as anything using mixed reality or hand tracking, can only be tested running natively in-headset. We’ve also heard that the time it takes to create an Android build, deploy it to the headset, put the headset on, and get to the point where you’re testing a change slows iteration down. To solve this, we created Meta XR Simulator, which integrates with Unity, Unreal, and native engines and sits on top of play-in-editor functionality.
The controllers released with each generation of Quest have different haptic capabilities. The Quest 2’s haptics motor only works within a narrow frequency band. The Quest 3’s haptics motor is much nicer and can hit a wider range of frequencies and amplitudes. The Quest Pro has the same main haptics motor as the Quest 3, plus additional motors on the thumbstick and index trigger for localized interaction feedback.
You can take advantage of this hardware using Haptics Studio. Haptics Studio’s main goal is one-click conversion of audio files into haptics files that play back using the best capabilities of each controller. It also connects to your headset so you can iterate on haptics adjustments live: press play on your computer and your controller performs the haptics you’re working on.
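In Unity, clips exported from Haptics Studio are played back through the Meta Haptics SDK. A minimal sketch, assuming the SDK's Oculus.Haptics API; the class and method names here reflect our understanding of that SDK, and the grab handler is hypothetical:

```csharp
using UnityEngine;
using Oculus.Haptics;

// Minimal sketch (Meta Haptics SDK for Unity assumed): play a .haptic clip
// authored in Haptics Studio. The runtime maps the clip onto whatever the
// connected controller's motor can actually reproduce.
public class PlayHapticOnGrab : MonoBehaviour
{
    public HapticClip grabClip; // imported .haptic asset from Haptics Studio
    HapticClipPlayer player;

    void Start()
    {
        player = new HapticClipPlayer(grabClip);
    }

    // Hypothetical hook: call this from your grab interaction.
    public void OnGrab()
    {
        player.Play(Controller.Right);
    }
}
```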
We also maintain our own fork of RenderDoc, a popular tool for debugging how a frame gets rendered on the GPU. It makes connecting to a Quest and capturing frames easy, and it gives you much deeper insight into how your title generates its frames.
Next Steps
We’ve covered a great deal of optimization for while you’re building, but what comes next? We hosted a session with Maeva Sponbergs from Beyond Frames to share business and marketing best practices on leveraging App Lab to set yourself up for success.
Figuring out who your audience is can be even more crucial than where your app is available. With so many apps out there on platforms like Steam, Meta Quest Store, or the App Store, standing out gets tougher every day. You need an experience that not only grabs attention but also appeals to a broader audience.
Launch your app with intent
If you're thinking about launching your app, consider starting with App Lab to give your app an Early Access launch. This is an awesome chance to fine-tune your experience and refine your targeting before you go for the big launch.
Dive into App Lab with a clear purpose in mind. This platform isn't just a launchpad; it's a fantastic workshop for honing your game. From fine-tuning features and design elements to building a community that's engaged and giving feedback, App Lab is your sandbox and a great place to start your journey.
And the good part? App Lab offers both open and closed environments for testing. This means you've got the flexibility to decide how and when you want to gather feedback. It's about making the most of this opportunity to interact with your audience and let them have a say in the evolution of your game.

So what are the discovery challenges on App Lab? The main roadblock you're likely to hit is getting your game noticed. You'll need strategic planning from the get-go to make sure people can find their way to your game's page. Here is a simple checklist to ensure you’re on the right track:
First, know your intended audience. Are they puzzle lovers seeking their next challenge? Story-seekers looking to dive into another world? Action enthusiasts craving quick, intense matches? Identifying your audience is crucial, as it will inform all the subsequent steps.
Next, find your story. What's the narrative you want to weave through your marketing beats, and how can you adapt it to fit your audience? How can you break this story down into engaging, bite-sized content that'll drive your future creations? Whether it's sharing game features and updates or celebrating holidays, there's always a reason to chat; you just need the right hook, the right language, and a thoughtful narrative plan.
Social media mastery is important. Secure your handles early and drop engaging content regularly. Get to know each platform's quirks and use hashtags and keywords to get on the algorithm's good side. Plan out a content calendar for the next three months to keep your messaging on track and make content capture part of your development process.
Give your audience a voice. Set up a Discord server before you start posting, and be ready to engage.
Take a strategic approach to creators. Find creators who genuinely love the type of game you are making. Start with smaller or mid-sized influencers and focus on building meaningful relationships that drive feedback.
Now, about feedback: Always look at it critically. A handful of rave reviews is great, but ensure that you’re focused on growing your audience and not just generating positive feedback. Use positive feedback as a stepping stone, not a comfort zone, to assess if what you’re offering matches your audience’s expectations. Expand your reach with the help of your early supporters to validate the sentiment. Feedback is gold only if you're willing to act on it. It's about finding the sweet spot between staying true to your game's core and being open to changes based on what players are telling you.
After working so hard to optimize your title for Meta Quest 3, embrace the journey: put in the work to discover and engage your audience throughout the development process to set yourself up for a successful launch.
We had a fantastic day at GDC hearing about Meta Quest 3’s features from Ben Walker and learning how to best engage with your audience from Maeva Sponbergs. We’ve covered our other activations at GDC here on our blog, so make sure you check our other posts for more updates, best practices, and announcements!