Oculus Developer Updates from OC3

Oculus Developer Blog
Posted by Oculus VR
October 7, 2016

Yesterday’s Oculus Connect 3 keynote was packed with exciting announcements. While we’ve covered the highlights on the Oculus blog, we wanted to take a deeper dive into some important developer news here.

Oculus makes it easier to develop for UE4
We’ve been close partners with Epic Games since the beginning and have used Unreal Engine 4 on countless projects, including the Emmy Award-winning short Henry.

Now, we want to put the power of Unreal in the hands of every Oculus developer. Starting today, Oculus will cover Unreal Engine license fees for any UE4 app sold through the Oculus Store, up to the first $5 million of gross revenue.
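
For context, here's a back-of-envelope sketch of what that coverage is worth, assuming Epic's standard 5% gross-revenue royalty for UE4 at the time (the small per-product, per-quarter exemption is ignored here; these figures are our illustration, not official terms):

```python
# Back-of-envelope only: assumes Epic's standard UE4 royalty of 5% of
# gross revenue; ignores the per-product, per-quarter exemption.
ROYALTY_RATE = 0.05
COVERED_GROSS = 5_000_000  # the first $5M of gross revenue on the Oculus Store

max_covered_royalty = round(ROYALTY_RATE * COVERED_GROSS)
print(f"${max_covered_royalty:,}")  # $250,000
```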

We can’t wait to see what you create.

Avatar SDK
We want VR to be the most social platform ever, and we’re enabling social presence at a platform level. Our new Oculus Avatars system lets people customize their own persistent VR identities for more lifelike and memorable interactions.

We’re also releasing an Avatar SDK that gives you an out-of-the-box solution for Touch interactions and presence. Because it brings people’s avatars into your experience, it lets them feel like themselves and easily recognize friends.

The Avatar SDK will be available for Rift at Touch launch and for mobile in early 2017, with integrations for Unity and Unreal as well as a native SDK.

And this is just one way we’re helping you make the most social experiences possible.

Oculus Parties and Rooms
We announced Oculus Parties, which lets you start a voice call with up to eight people from anywhere in VR. This feature is accessible from Home as well as the universal menu for added convenience.

We also announced Oculus Rooms, where people can instantly meet up with friends in VR to hang out, watch movies, or jump into the same app simultaneously using our Coordinated App Launch API.

When you gather people together around the app launcher in Rooms, the Coordinated App Launch API lets you seamlessly move to a new experience together. As a developer, you can integrate with the API to bring these shared, more social experiences to your app.

Both Parties and Rooms will ship for Gear VR in the coming weeks, with the Rift launch scheduled for early 2017. The Coordinated App Launch API will be included in our Platform SDK so you can start integrating it.

The VR Web
We’re working to accelerate an open ecosystem of VR experiences built on web technology. The VR Web lets you move cross-platform between VR experiences instantaneously—with zero downloads or installs.

We’re working on ReactVR, a new SDK that’ll make it easy for developers to create their own cross-platform immersive experiences. ReactVR is built on React—one of the most popular JavaScript libraries for building web and mobile content.

Next month, we’ll release a developer preview of ReactVR, so you can start building for the VR web.

We’re also making it easy for people to discover, enjoy, and share your VR web content. We’ll release a developer preview of our Oculus VR browser, codenamed “Carmel,” for Oculus devices soon.

This technology will fundamentally change the way the world experiences VR. We’re excited to help you deliver cross-platform, instantly accessible content to the broadest market possible.

Audio SDK
We announced an exciting new feature of the cross-platform Oculus Audio SDK: ambisonic rendering.

Ambisonic rendering allows for a sphere of sound that shifts realistically as you move through the virtual environment. Combined with our existing spatialization techniques, this opens up the possibility for even greater immersion and presence.
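
For the curious, first-order ambisonic (B-format) encoding is textbook signal math. This Python sketch shows the classic encoding equations for one mono sample; it is an illustration, not the Oculus Audio SDK's actual implementation:

```python
import math

def encode_first_order(sample, azimuth, elevation):
    """Encode one mono sample into first-order ambisonic B-format (W, X, Y, Z).

    Classic B-format equations (angles in radians); the Audio SDK's
    internal representation may differ.
    """
    w = sample / math.sqrt(2.0)                           # omnidirectional
    x = sample * math.cos(azimuth) * math.cos(elevation)  # front/back axis
    y = sample * math.sin(azimuth) * math.cos(elevation)  # left/right axis
    z = sample * math.sin(elevation)                      # up/down axis
    return (w, x, y, z)

# A source straight ahead contributes only to the W and X channels:
print(encode_first_order(1.0, 0.0, 0.0))  # (0.7071067811865475, 1.0, 0.0, 0.0)
```

As the listener turns, a renderer rotates the (X, Y, Z) components before decoding to speakers or binaural output, which is what lets the sound field shift realistically with head movement.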

Ambisonic rendering is already available as part of the 1.1 Audio SDK release, and support is coming to Gear VR.

We’re really excited to make it available to everyone.

What’s next for Rift and Touch
We’ve shipped six major software releases since the Rift launch, and today we announced that the October update, including the 1.9 PC SDK release, will ship next week.

Everything you need to develop for Touch is available today, including out-of-the-box integrations for Unity and Unreal—with samples to help you get up and running instantly.

Guardian System and sensor options
The new Guardian System for Touch helps your players stay aware of their play area while in VR.

You can achieve a 360-degree play space with two sensors in a front-to-back configuration, or add a third sensor for our room-scale option. Extra sensors will be available for purchase on Oculus.com at Touch launch.

Good news for developers: once someone sets up their play area with the Guardian System, you can query for its size and customize your experiences to fit the playable space.
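
As a sketch of that workflow, assume you've already retrieved the play-area dimensions from the runtime's boundary query; a hypothetical helper (our illustration, not an SDK function) can then keep gameplay targets inside the boundary:

```python
def clamp_to_play_area(position, play_area_size, margin=0.2):
    """Clamp a desired (x, z) position in meters into the user's play area.

    play_area_size is the (width, depth) you'd retrieve from the runtime's
    boundary query; the values used below are purely illustrative.
    """
    half_w = play_area_size[0] / 2.0 - margin
    half_d = play_area_size[1] / 2.0 - margin
    x = max(-half_w, min(half_w, position[0]))
    z = max(-half_d, min(half_d, position[1]))
    return (x, z)

# With a 2.0 m x 1.5 m play area, a target requested at (3.0, 0.0)
# is pulled back inside the safety margin:
x, z = clamp_to_play_area((3.0, 0.0), (2.0, 1.5))
print(round(x, 2), round(z, 2))  # 0.8 0.0
```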

Asynchronous Spacewarp
We also announced our latest software advancement—Asynchronous Spacewarp (ASW).

While Asynchronous Timewarp reduces judder, improves efficiency, and delivers consistent low latency, ASW lets your title run at 45 Hz and achieve an immersive experience that's almost on par with native 90 Hz rendering. This makes it easier for lower-end machines to power Rift.

With ASW, we’re introducing a minimum spec for Rift. With lower CPU and GPU requirements, people can get into VR at a lower cost with a wider range of hardware. As a developer, you’ll still target 90 Hz on recommended-spec systems, and ASW will let us bring your title to minimum-spec systems. We’re thrilled to help bring the world of VR—and your incredible content—to a much wider audience.
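
To make the numbers concrete, here's a small Python sketch of the frame-budget arithmetic, with a toy scalar stand-in for ASW's extrapolation (the real technique extrapolates image-space motion per region of the frame, not a single value):

```python
def frame_budget_ms(hz):
    """CPU + GPU time available per rendered frame at a given display rate."""
    return 1000.0 / hz

def synthesize(prev, curr):
    """Extrapolate half a 45 Hz interval past the last rendered frame --
    a toy scalar stand-in for ASW's image-space motion extrapolation."""
    return curr + 0.5 * (curr - prev)

print(round(frame_budget_ms(90), 1))  # 11.1 -- budget at full rate
print(round(frame_budget_ms(45), 1))  # 22.2 -- budget when ASW kicks in
# Two rendered "positions" at 45 Hz; ASW fills the 90 Hz display frame between:
print(synthesize(prev=0.0, curr=1.0))  # 1.5
```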

Mobile SDK
We’re equally focused on helping our mobile developers produce high-performance, high-fidelity experiences.

Since many third-party performance tools are either hardware-specific or simply unavailable to all developers, we’ve been quietly building our own tool—the Oculus Remote Monitor.

With the latest Mobile SDK release, you can deep dive on CPU zones, OpenGL draw calls, CPU/GPU clock frequencies, and process-wide memory allocations to better track app performance.

We also announced Multiview, a new OpenGL extension that helps address the inefficiency of sequential rendering to both eye buffers by letting you render to multiple elements of a 2D texture simultaneously.

Multiview is already available for native apps on some Gear VR devices, and we’ve seen CPU improvements of up to 50% on internal apps. We’ll roll out Multiview for Unreal and enter beta for Unity by the end of the year.
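
To see where that saving comes from, here's a toy Python model (our own illustration, not profiler data): in the ideal case where draw-call submission dominates CPU time, submitting each draw once instead of once per eye halves the cost, which is the up-to-50% bound:

```python
def submission_cost_us(num_draws, per_draw_us, multiview):
    """Toy model of CPU time spent issuing draw calls for one stereo frame.

    Sequential stereo submits every draw once per eye; with the multiview
    extension each draw is submitted once and rendered into both layers
    of a 2D texture array.
    """
    passes = 1 if multiview else 2
    return num_draws * per_draw_us * passes

seq = submission_cost_us(500, 20, multiview=False)
mv = submission_cost_us(500, 20, multiview=True)
print(f"saving: {100 * (seq - mv) / seq:.0f}%")  # saving: 50%
```

Real scenes see less than the ideal bound whenever GPU work or non-rendering CPU work dominates, which is why the gains observed on internal apps are reported as "up to" 50%.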

We also announced the impending launch of Dynamic Streaming-as-a-service (DSAAS), which lets you enjoy the benefits of Dynamic Streaming without building complicated infrastructure or paying for increased storage and bandwidth.

With our APIs, you upload your video, we slice it into different streams as necessary and serve the content back to your viewers.

Both ambisonic audio and DSAAS will launch in the upcoming months, with original content from Felix & Paul, Jaunt, and Within.

We can’t wait for you to go hands-on with all this new technology, and we look forward to your feedback.

— The Oculus Team