The latest updates from the Oculus Software, Integrations, and Docs teams are now live. This month's major updates include the launch of Oculus Rift livestreaming, the introduction of a new Audio Propagation package to our Audio SDK, and more!
Livestreaming is an easy way for users to share their VR experiences with friends and family on Facebook. This feature has been available for Oculus Go and Gear VR applications, and recently rolled out to Rift users via our Public Test Channel. Users simply activate the feature within the Oculus Dashboard, and can then stream 2D video and images of your app to their Facebook page for later viewing, increasing discoverability and reach for your application.
Check out the official blog post for more information and details on how to activate this new feature.
You can now offer a new way for your app users to contact your support team! We've added the option to include a support URL on the About page for each of your applications, enabling you to manage customer support from your solution of choice.
Note that new applications will still need to complete the standard application process. To add a support URL, head over to the application overview page within your developer dashboard and enter your existing support destination. It's that easy.
The latest Audio SDK package introduces an Audio Propagation (beta) feature to our Oculus Audio Spatializer Plugins. This update provides real-time occlusion and reverb simulation based on game geometry, producing accurate audio propagation through a scene with minimal setup. Simply tag the scene meshes that you want included in the simulation and select the acoustic material for each mesh.
We're excited to launch this new feature, especially as audio is such an important element of a truly immersive experience. To get started, download and enable the latest Oculus Audio Spatializer Plugin for your Oculus development platform of choice.
We now support a single stereoscopic input texture for Equirect and Cylinder layers. The most common stereo texture formats, left/right and top/bottom eye arrangements, are supported, along with 360, 180, and custom output types on either axis.
This change enables seamless stereo video playback and gives Unity developers a more efficient alternative workflow. We've also expanded our color scale support to be toggled per layer, allowing effects like fade-to-black to be controlled on a per-overlay basis.
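Conceptually, a single stereoscopic texture packs both eye views into one image, and the compositor samples a different sub-rectangle of it for each eye. The sketch below illustrates that per-eye sub-rect selection for the two common packings; the layout names and the (u, v, width, height) rect convention are illustrative assumptions, not the SDK's actual API.

```python
# Illustrative sketch of per-eye UV sub-rect selection for a single
# stereoscopic input texture. Layout names and the (u, v, width, height)
# rect convention are assumptions for illustration, not the Oculus SDK API.

def eye_uv_rect(layout, eye):
    """Return the normalized (u, v, width, height) sub-rect sampled for
    the given eye ("left" or "right") from a packed stereo texture."""
    if layout == "left_right":          # eyes packed side by side
        u = 0.0 if eye == "left" else 0.5
        return (u, 0.0, 0.5, 1.0)
    if layout == "top_bottom":          # eyes packed one above the other
        v = 0.0 if eye == "left" else 0.5
        return (0.0, v, 1.0, 0.5)
    if layout == "mono":                # same full texture for both eyes
        return (0.0, 0.0, 1.0, 1.0)
    raise ValueError(f"unknown layout: {layout}")

print(eye_uv_rect("left_right", "right"))  # (0.5, 0.0, 0.5, 1.0)
print(eye_uv_rect("top_bottom", "left"))   # (0.0, 0.0, 1.0, 0.5)
```

Because the compositor does this selection itself, a stereo video frame can be submitted as one texture rather than two, which is where the efficiency gain for Unity developers comes from.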
Matchmaking has been refreshed and improved. Now when you create a matchmaking pool, instead of choosing between Browse, Quickmatch, and Advanced Quickmatch, you can opt for the system to manage a queue of rooms. You can also choose whether users, the system, or both can create rooms for the queue. These updates make pools more versatile, supporting even more matchmaking scenarios.
To leverage these changes, simply go to the developer dashboard and create a new matchmaking pool. If you have an existing pool for browse matchmaking, it has been migrated to the new settings.
For more information, see the Matchmaking Overview or How to: Implement Simple, Advanced, or Browse Matchmaking.
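The room-queue behavior described above can be sketched as a small conceptual model: an arriving user joins the first room with a free slot, and otherwise a new room is created for the queue if the pool's creation policy allows it. The class and policy names here are illustrative assumptions for the sketch, not the Oculus Platform SDK API.

```python
# Conceptual model of room-queue matchmaking: arriving users join an
# existing room with a free slot, or a new room is created for the queue.
# This models the dashboard behavior only; it is not the Platform SDK API.

class Room:
    def __init__(self, max_users, created_by):
        self.max_users = max_users
        self.created_by = created_by   # "user" or "system"
        self.users = []

    def has_space(self):
        return len(self.users) < self.max_users

class RoomQueue:
    def __init__(self, max_users=2, creation_policy="both"):
        # creation_policy: "users", "system", or "both" may create rooms
        self.max_users = max_users
        self.creation_policy = creation_policy
        self.rooms = []

    def enqueue(self, user):
        # Join the first room in the queue with a free slot, if any.
        for room in self.rooms:
            if room.has_space():
                room.users.append(user)
                return room
        # Otherwise the system creates a new room, if the pool allows it.
        if self.creation_policy in ("system", "both"):
            room = Room(self.max_users, created_by="system")
            room.users.append(user)
            self.rooms.append(room)
            return room
        return None  # user waits until a user-created room appears

pool = RoomQueue(max_users=2)
a = pool.enqueue("alice")
b = pool.enqueue("bob")
print(a is b, len(pool.rooms))  # True 1 (both users matched into one room)
```

In the real feature this queue lives server-side and is configured per pool in the developer dashboard; the sketch only shows why letting users, the system, or both create rooms changes how the queue fills.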
We're excited to talk tech with you beyond this website and your inbox. See below for a few of the events we'll be attending in the near future.
Oculus Start is our developer program providing qualifying developers with access, support, and savings so you can focus on what's really important: creating inspired VR applications. We are continuing to accept applications and are excited to see what you submit!
If you'd like to join the program, head over to Oculus Start to see if you qualify.
Are you already in the Oculus Start program and attending this year's GDC? Shoot an email to email@example.com if you're planning to attend this year's main event.
- The Oculus Team