Kickstart Your VR Development with the Latest Showcase Apps for Meta Quest
Since we first launched Presence Platform in 2021, we’ve been dedicated to expanding its suite of capabilities so our developer community can build even more engaging mixed reality, interaction, and voice experiences, complete with realistic facial expressions and body movements.
In the past few months alone, we’ve been thrilled to offer you new ways to immerse your audience on the Meta Quest Platform with Movement SDK, Shared Spatial Anchors, and Audio SDK. As part of Presence Platform, these capabilities open new ways for people to express themselves, interact, and connect with others in VR, and they give our developer community tools to help people feel more present and engaged with your apps.
We can’t wait to see how you use our latest updates to continue pushing the envelope of what’s possible in VR. But we also know that developing a high-quality app takes time, and immediately integrating new features may not be feasible depending on the stage of your project. That’s why we’ve been busy creating new showcase apps that let you see how new capabilities were integrated and get inspired by the impact they can have on your projects, both now and in the future. Dive in below to discover hands-on experiences on App Lab and GitHub that showcase the latest updates to Presence Platform.
Get a Fresh Look at Eye and Face Tracking with Aura
Today we’re excited to announce the release of Aura, a new sample app that can help you get familiar with the technology driving eye and face tracking, without needing to start a new project or write any code. The app shows face and eye tracking retargeted to Aura, the alien character we first shared with you at Meta Connect 2022. When you make face or eye movements, Aura mirrors the movements sensed by Meta Quest Pro’s cameras, letting you see how your expressions are represented in-headset with our new capabilities.
Aura offers developers some of the same tools we used to debug and retarget blendshapes during development. A histogram shows each blendshape as it activates, giving you clear insight into what Meta Quest Pro’s cameras can detect during different facial movements, including movements made while speaking.
To achieve more realistic character expressions, we recommend retargeting expressions according to the dimensions of your character. For instance, if your character has a very large mouth, you might need to tone down the detected mouth expressions. In practice, this means emphasizing or de-emphasizing individual blendshapes as part of your retargeting process.
To give you a feel for how this works, Aura lets you adjust blendshapes directly. The app provides sliders for each blendshape so you can apply a multiplier (to emphasize or de-emphasize it) and a cap (to limit its effect). Playing with these sliders will give you a better sense of the retargeting you’ll eventually do in your own projects.
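If you want to experiment with the same idea in your own Unity project, here’s a minimal sketch of the multiplier-and-cap approach. The data source (GetRawBlendshapeWeight) is a hypothetical placeholder for whatever normalized weights your face-tracking SDK provides:

```csharp
using UnityEngine;

// A minimal sketch of the multiplier-and-cap retargeting idea described above.
// GetRawBlendshapeWeight is a hypothetical stand-in for your face-tracking
// data source (assumed to return a normalized 0..1 weight per blendshape).
public class BlendshapeRetargeter : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer face;
    [SerializeField] private int blendshapeIndex;

    [Range(0f, 2f)] public float multiplier = 1f; // emphasize (>1) or de-emphasize (<1)
    [Range(0f, 1f)] public float cap = 1f;        // hard limit on the final weight

    void LateUpdate()
    {
        float raw = GetRawBlendshapeWeight(blendshapeIndex); // 0..1 from tracking
        float retargeted = Mathf.Min(raw * multiplier, cap);

        // Unity blendshape weights are expressed in percent (0..100).
        face.SetBlendShapeWeight(blendshapeIndex, retargeted * 100f);
    }

    // Placeholder: wire this up to your face-tracking data source.
    float GetRawBlendshapeWeight(int index) => 0f;
}
```

Capping after the multiplier is what keeps an emphasized blendshape from overshooting your character’s sculpted extremes.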
Aura is available to download on App Lab. If you feel inspired by Aura, you can explore the features further by downloading the Movement SDK sample on GitHub. Note: While eye tracking and face tracking are only supported on Meta Quest Pro, body tracking is supported on both Meta Quest 2 and Meta Quest Pro, giving you an easy way to access a torso without complex inverse kinematics (IK).
Hear What Voice SDK Has to Offer with Whisperer
Voice SDK (available in Unity and Unreal) lets you build voice-driven experiences and gameplay that give people a natural and engaging way to interact with characters and in-app environments. We created the showcase app Whisperer in collaboration with BUCK to demonstrate the immersive power of voice interactions in VR.
Using a unique combination of features like Voice Command, Dictation (Speech-to-Text), and Voice Attention System, Whisperer takes you on a puzzle-based journey where you find yourself reimagined as a ghost who can only interact with the world using your voice. When you raise your hands in front of you, the headset’s microphone automatically activates and Voice SDK starts listening. You can then speak to various objects to advance through the story, telling them to move, open, turn on, and more. Oh, and don’t forget to ask Harold the macaw for help if you’re stuck!
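Whisperer’s full source is on GitHub, but here’s a simplified sketch of that hand-raise trigger, assuming the Unity Voice SDK’s AppVoiceExperience component and head/hand anchor transforms assigned in the Inspector; the threshold value is illustrative:

```csharp
using Oculus.Voice; // Voice SDK's AppVoiceExperience component
using UnityEngine;

// A simplified sketch of "raise your hands to speak"; Whisperer's actual
// implementation lives in its GitHub source.
public class RaiseHandsToSpeak : MonoBehaviour
{
    [SerializeField] private AppVoiceExperience voice;
    [SerializeField] private Transform head;      // e.g. the center-eye anchor
    [SerializeField] private Transform leftHand;  // tracked hand or controller anchors
    [SerializeField] private Transform rightHand;
    [SerializeField] private float raiseThreshold = 0.25f; // meters below head height

    private bool listening;

    void Update()
    {
        bool handsRaised =
            leftHand.position.y > head.position.y - raiseThreshold &&
            rightHand.position.y > head.position.y - raiseThreshold;

        if (handsRaised && !listening)
        {
            voice.Activate();   // open the mic and start streaming speech
            listening = true;
        }
        else if (!handsRaised && listening)
        {
            voice.Deactivate(); // stop listening when the hands drop
            listening = false;
        }
    }
}
```

Gating activation on a physical gesture like this avoids an always-open microphone and gives players a clear mental model of when the app is listening.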
Not only does this Unity project offer unique gameplay, it also provides best practices for designing an engaging user experience using voice interactions in VR. After trying out the app for yourself on App Lab, you can find the source code on GitHub to deconstruct how specific voice interactions were implemented.
If you’re interested in learning more about Voice SDK or want to optimize the voice interactions in your apps, check out our Voice SDK Best Practices.
Dive into Presence Platform with Slimeball!
Presence Platform capabilities are designed to help you take advantage of features like stereoscopic color Passthrough, face and eye tracking, and more, so people feel present together in social, collaborative, and competitive mixed reality.
To demonstrate how many of these Presence Platform capabilities can be combined to build compelling and unique experiences, we partnered with Hackett and Skillman to create Slimeball!, a competitive, tabletop MR game with an emphasis on expressive characters and natural interactions. Players compete to feed Squelch by shooting slimeballs, items, and other surprises into its mouth for points and bonuses.
You can download Slimeball! on App Lab to get inspired by a hands-on experience designed to show how you can use these capabilities in your own projects:
Passthrough: Slimeball! uses Passthrough to demonstrate how mixed reality with full stereoscopic color creates a more engaging experience. With Meta Quest Pro, developers can use stereoscopic color Passthrough to help people feel more present in their apps—and this feeling of presence can help drive retention, engagement, and a better overall user experience.
Hand tracking: Slimeball! was built with hand tracking in mind to create a more engaging and immersive experience. Players use their hands to grab and shoot slimeballs and other items at Squelch. Hand tracking with Interaction SDK enables the game to predict the trajectory of thrown objects and makes picking up and throwing items feel like a natural interaction between players and their MR environment (a rough sketch of this throw mechanic follows the note below).
Eye tracking and face tracking: In Slimeball!, players control Squelch directly with the help of eye and face tracking to shoot slimeballs and other items into its mouth. Players can open their mouths or adjust mouth size while shooting, and Squelch will mirror those movements to increase the chance that shots hit their target.
Spatial Anchors: Slimeball! uses Spatial Anchors to enable an interactive, virtual tabletop game board that people can place over a surface in their physical environment while in-headset. Players can adjust the game board to the size of their play surface, and it remains anchored to that surface throughout the game, giving them the freedom to play in almost any physical environment. You can use Spatial Anchors to build experiences in which players place digital content over physical objects and have it persist across sessions, as sketched below.
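Here’s a minimal sketch of that placement flow, assuming the OVRSpatialAnchor component from the Meta XR SDK; the exact save API varies by SDK version, so treat this as an outline rather than Slimeball!’s actual code:

```csharp
using System.Collections;
using UnityEngine;

// A minimal sketch of anchoring a game board to a real surface with
// OVRSpatialAnchor (Meta XR SDK); the save call's signature varies by version.
public class GameBoardAnchor : MonoBehaviour
{
    [SerializeField] private GameObject boardPrefab;

    // Call with a pose on the physical play surface, e.g. from a raycast.
    public void PlaceBoard(Vector3 position, Quaternion rotation)
    {
        GameObject board = Instantiate(boardPrefab, position, rotation);

        // Adding the component creates and localizes an anchor at this pose.
        var anchor = board.AddComponent<OVRSpatialAnchor>();
        StartCoroutine(SaveWhenReady(anchor));
    }

    private IEnumerator SaveWhenReady(OVRSpatialAnchor anchor)
    {
        // Wait until the runtime has finished creating the anchor.
        yield return new WaitUntil(() => anchor.Created);

        // Persist the anchor so the board reappears in the same spot next session.
        anchor.Save((savedAnchor, success) =>
        {
            if (!success) Debug.LogWarning("Anchor save failed; board won't persist.");
        });
    }
}
```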
Note: Slimeball! is supported on both Meta Quest Pro and Meta Quest 2. For the best experience, we recommend playing on Meta Quest Pro, which supports all of the capabilities referenced above.
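Returning to the hand tracking item above, here’s a rough sketch of a common way to implement that throw: sample the held object’s velocity each physics step, then hand the averaged release velocity to the physics engine. The names are illustrative, not Slimeball!’s actual code:

```csharp
using System.Collections.Generic;
using UnityEngine;

// A rough sketch of a hand-tracked throw: record recent velocity samples
// while the object is held, then apply the average at the moment of release.
public class ThrowableSlimeball : MonoBehaviour
{
    [SerializeField] private Rigidbody body;
    [SerializeField] private int sampleWindow = 5; // physics steps to average over

    private readonly Queue<Vector3> samples = new Queue<Vector3>();
    private Vector3 lastPosition;

    void OnEnable()
    {
        lastPosition = transform.position;
    }

    void FixedUpdate()
    {
        // While the object is held, record per-step velocity samples.
        Vector3 velocity = (transform.position - lastPosition) / Time.fixedDeltaTime;
        samples.Enqueue(velocity);
        if (samples.Count > sampleWindow) samples.Dequeue();
        lastPosition = transform.position;
    }

    // Call this from your grab system when the hand releases the object.
    public void Release()
    {
        Vector3 average = Vector3.zero;
        foreach (Vector3 v in samples) average += v;
        if (samples.Count > 0) average /= samples.Count;

        body.isKinematic = false; // hand control back to physics
        body.velocity = average;  // throw along the averaged hand motion
    }
}
```

Averaging over a few frames smooths out tracking jitter at the moment of release, which is what makes a hand-tracked throw feel predictable.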
Developing a high-quality VR app requires innovation, passion, and problem solving. Whether you’re ideating on your next great project or want to learn about new ways for your app’s users to interact in VR, downloading showcase apps is a great way to get familiar with our latest capabilities and spark ideas for integrating similar features into your own apps. Stay in the loop on our Twitter and Facebook pages to get updates when new showcase apps are released or existing ones are updated.