Happy Holidays! This is the last tech update of 2017, and what a year it has been. We have new documentation for your reading pleasure over the holidays, updates for our integrations, and news about Rift Core 2.0 and the introduction of Oculus Dash.
New Documentation
Building VR apps can be tough. We've overhauled our Best Practices guide with what we've learned about developing VR apps. This updated guide first introduces the high-level concepts you should keep in mind when designing your VR experience. The guide then dives into some specific ways to implement locomotion, user input, positional tracking, and much more. These best practices are intended to help developers produce content that provides a safe and enjoyable consumer experience on Oculus hardware.
You've been asking for a more detailed look at how to use the Platform SDK features in Unreal Engine. We've published comprehensive updates in the Platform SDK docs with information about how to use every Platform feature. Get started on the Unreal Development Getting Started page.
Unity Integrations
Within Unity, we have deprecated support for the 5.4 and 5.5 release channels, and we strongly recommend that all developers working in the 5.6 release channel use 5.6.4p2. We've added equirectangular support to VR Compositor Layers (mobile only). We've also fixed a shader issue in the Unity Sample Framework within the Avatar SDK that resulted in very long import times.
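For reference, here is a minimal sketch of how an equirectangular compositor layer might be set up with the Oculus Utilities `OVROverlay` component. The field and enum names (`currentOverlayShape`, `OverlayShape.Equirect`, `textures`) are based on recent Utilities versions and may differ in yours; treat this as illustrative, not definitive:

```csharp
using UnityEngine;

// Attach to a GameObject to display a 360 panorama as an
// equirectangular compositor layer (mobile only).
public class EquirectLayerExample : MonoBehaviour
{
    // A 2:1 equirectangular texture, e.g. a 360 photo or video frame.
    public Texture equirectTexture;

    void Start()
    {
        var overlay = gameObject.AddComponent<OVROverlay>();
        // Equirect shape wraps the texture around the user as a sphere.
        overlay.currentOverlayShape = OVROverlay.OverlayShape.Equirect;
        overlay.textures = new Texture[] { equirectTexture };
    }
}
```

Because the layer is composited by the VR runtime rather than rendered into the eye buffers, it stays sharp even when the application renders at a reduced resolution.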
Rift Core 2.0
Rift Core 2.0 introduces substantial changes to Oculus Home and replaces the Universal Menu with Oculus Dash. We plan to roll it out to Rift users with the 1.22 runtime in early 2018.
Adding Dash support to your application produces a better user experience, and we recommend doing so when possible. All the resources you need to add support and test it before the public roll-out are now available.
Dash re-implements the Universal Menu as a VR compositor layer. Have a look at the “Introducing Oculus Dash” video in our Welcome to Rift Core 2.0 blog post to get a sense of how it works.
Beginning with runtime 1.22, when users pause an application, instead of rendering the Universal Menu in an empty room, one of two things will happen:
If the application includes Dash support, the application will pause and the Dash menu UI will be drawn over the paused application.
If the application does not include Dash support, the application will be paused by the runtime and the user will be presented with the Dash menu UI in an empty room, similar to the way the Universal Menu is displayed in earlier runtimes.
When the Dash UI is active, the runtime renders tracked controllers in the scene so the user can interact with the menu. Your application should pause and mute itself, and hide any tracked controllers it renders, so the user does not see a duplicate pair of hands.
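In Unity, for example, this pause-and-hide behavior can be driven from the input-focus events exposed by the Oculus Utilities. The sketch below assumes the `OVRManager.InputFocusLost` and `OVRManager.InputFocusAcquired` events available in recent Utilities versions; the hand objects and the use of `Time.timeScale` for pausing are illustrative choices, not requirements:

```csharp
using UnityEngine;

// Pauses, mutes, and hides rendered hands while Dash has input focus.
public class DashFocusHandler : MonoBehaviour
{
    // Your application's rendered controller/hand models (hypothetical names).
    public GameObject leftHand;
    public GameObject rightHand;

    void OnEnable()
    {
        OVRManager.InputFocusLost += OnFocusLost;
        OVRManager.InputFocusAcquired += OnFocusAcquired;
    }

    void OnDisable()
    {
        OVRManager.InputFocusLost -= OnFocusLost;
        OVRManager.InputFocusAcquired -= OnFocusAcquired;
    }

    void OnFocusLost()
    {
        Time.timeScale = 0f;        // pause the simulation
        AudioListener.volume = 0f;  // mute audio
        leftHand.SetActive(false);  // hide hands so Dash's aren't duplicated
        rightHand.SetActive(false);
    }

    void OnFocusAcquired()
    {
        Time.timeScale = 1f;
        AudioListener.volume = 1f;
        leftHand.SetActive(true);
        rightHand.SetActive(true);
    }
}
```

The same state is available in the native SDK via the session status, so Unreal and native applications can apply the equivalent pause/mute/hide logic when input focus is lost.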
For more information on how to implement Oculus Dash for your application, visit our Unity, Unreal, or Native integration documentation.