Asynchronous Spacewarp
Oculus Developer Blog
Posted by
Dean Beeler, Ed Hutchins, and Paul Pedriana
November 10, 2016

TL;DR: Oculus is releasing a new technology aimed at reducing system hardware requirements while maintaining content quality across a wider array of hardware. Asynchronous Spacewarp (ASW) is a frame-rate smoothing technique that almost halves the CPU/GPU time required to produce nearly the same output from the same content. Like Asynchronous Timewarp (ATW), ASW is automatic and enabled without any additional effort from developers.

VR Is Demanding
Rendering convincing, life-like virtual reality environments at a rate fast enough to create a sense of presence is demanding. VR hardware needs to be compact and power-efficient, yet very capable. VR software demands near real-time latency from every aspect of the system and the application. Developers need to remain mindful of all of these elements to produce an experience that's immersive and enjoyable. Given all this, it's amazing that we've come as far as we have in such a short time.

At Oculus, we've introduced several technologies such as Direct Mode and Asynchronous Timewarp that automatically help developers minimize latency and improve the experience under disparate software and hardware configurations. Today, we are releasing a new technology which automatically scales existing content to more affordable hardware.

Let's Do the Timewarp Again
Early DK2 prototypes first demonstrated basic Timewarp (TW): an orientation-only reprojection, limited to head rotation, that reprojected frames at the lowest possible latency. It was synchronous with rendering, but it demonstrated how effectively latency can be reduced by using head movement to predict where images will visibly land.

Asynchronous Timewarp was introduced at Rift launch and provided an automatic, asynchronous version of Timewarp. It works by intervening every frame to time warp the last submitted eye buffers, even if the application is taking too long to render the next frame. More about ATW can be found in our two earlier blog posts on the topic.

ASW builds on the frame-smoothing foundation of ATW. ATW ensures that the experience tracks the user's head rotation, so an image is always displayed in the correct location within the headset. Without ATW, when a VR application misses a frame, the whole world drags, much like slow-motion video playback. Encountering this while in VR is extremely jarring and generally breaks presence. ASW goes beyond this and tracks animation and movement within the scene to smooth over the whole experience.

What Is This Sorcery?
ASW generates extrapolated frames from previous frames generated by the VR application. On the surface, this sounds quite a bit like ATW, which is capable of extrapolating for only a single thing: the user's head rotation. While this covers many of the hitches, it's not a magic bullet. ASW works in tandem with ATW to cover all visual motion within the virtual reality experience. This includes character movement, camera movement, Touch controller movement, and the player's own positional movement. If the application falls behind the display's frame rate, the experience typically remains smooth and enjoyable.

An example of this extrapolation is shown below. We have a scene in which a held gun moves from right to left across frames 0 and 1 (generated 1/45th of a second apart), and we want to generate an extrapolated frame from that movement (1/90th of a second after frame 1). We detect the movement of the gun and generate a new frame, which we display on behalf of the application.

When it comes to virtual reality, ATW and ASW are siblings that complement each other. Timewarp is great at accommodating head rotation; in fact, it's perfect for static imagery at a distance. For applications like 360 video and distant scenery on the horizon, Spacewarp is unnecessary. Conversely, Spacewarp is quite good for animated objects up close, but not so great at tracking head rotation.

With ASW, Oculus is building on ATW to produce the best virtual reality experience possible. Effective latency stays low, head tracking remains smooth as before, and now moving elements within VR are kept smooth as well.

What's the Catch?
Just as with ATW, ASW is active and enabled for all applications without any developer effort.

There's no completely free lunch, however. ASW doesn't scale well below half the display's refresh rate. Depending on what's being displayed, there may be visual artifacts present as a result of imperfect extrapolation. Typical examples include:

  1. Rapid brightness changes. Lightning, swinging lights, fades, strobe lights, and other rapid brightness changes are hard for ASW to track. These portions of the scene may waver as ASW attempts to find recognizable blocks. Some kinds of animating translucency can cause a similar effect.
  2. Object disocclusion trails. Disocclusion is a graphics term for the reveal of surfaces that were hidden behind a moving object. As an object moves, ASW needs something to fill the space it leaves behind. ASW doesn't know what's actually there, so the world behind the object is stretched to fill the void. Since objects typically don't move far on the display between 45 fps frames, these trails are generally minimal and hard to spot. As an example, if you look closely at the extrapolated image in the screenshots here, you'll see a small amount of warping to the right of the revolver.
  3. Rapidly moving repeated patterns. An example might be running alongside an iron gate while looking at it. Since parts of the object look similar to others, it may be hard to tell which part moved where. With ASW, these mispredictions should be minimal but occasionally noticeable.
  4. Head-locked elements that move too fast to track. Some applications use a head-locked cockpit, HUD, or menu. When applications attempt to do this on their own without the help of a head-locked layer, the result can be judder, because the background moves quickly relative to the head-locked object. ASW can accommodate this to a degree, but users can move their heads fast enough that these elements no longer track properly and the result won't be smooth. Using the head-locked layers (ovrLayerFlag_HeadLocked) provided by the Oculus Rift SDK will produce the ideal result.

Outside of point 4, you shouldn't avoid scenarios that produce these artifacts, but you should be mindful of how they appear. ASW covers sub-90 fps rendering well under most, but not all, circumstances. We feel ASW is a significant improvement over judder and is largely indistinguishable from full-rate rendering in most circumstances.

As a Developer, What Can I Do to Make ASW Work Best?
ASW enables a class of computers that previously could not drive VR. On recommended-specification systems, ASW should rarely be seen, if at all; developers should maintain 90 fps rendering on those systems. Without any additional effort from the developer, the experience generally scales down to minimum-spec machines and uses ASW as needed, because apps that run at 90 fps on recommended-spec systems can typically run at 45 fps on min-spec systems. ASW, and VR in general, will benefit from the following suggestions:

  1. Use layers for head-locked content, HUDs, and menus. These items are tracked more accurately using layers in the Oculus runtime compositor. Using layers correctly will make these elements appear crisp and track correctly even if ASW is unavailable, and also allow improved image quality and readability for text.
  2. Never assume the display frame time is 1/90th of a second (or any other constant value). Run application simulations, including animations and physics, based on elapsed real time, not frame counts. Current VR headsets run at a range of frame rates, anywhere from 45 fps to 120 fps today. Fixing computations to any single expected value guarantees your application runs at the wrong rate on other hardware, and timing the application to the frame rate will almost certainly result in bugs and poor experiences.
  3. Provide quality settings that are easy for users to understand. Simple Low/Medium/High quality settings allow users to tweak a preferred quality sweet spot. Esoteric or hard to understand settings will result in users poorly tuning their application settings and having a negative experience with your software.

Do I Need a Supercomputer for This?
The more capable the machine, the less likely it is to need ASW. ASW's scene extrapolation is activated only when the application cannot maintain the nominal frame rate of the VR display. This means that if you've built a top-of-the-line machine, you may never see ASW in action.

The hardware requirements for ASW are modest. This functionality has been enabled on all current-generation AMD GPUs (RX 400 series) and previous- or current-generation Nvidia GPUs (GTX 900 or 1000 series).

Windows 8 or later is required for ASW, but the latest version of Windows 10 provides the best support. We developed this technology with assistance from Microsoft, AMD and Nvidia—special thanks to everyone there for their support.

Coming Soon to a System Near You
ASW is available with the release of the Oculus 1.10 runtime. It is enabled on all hardware and systems that support the feature, and activated for all applications. ASW automatically engages whenever the application needs extra time for rendering. For developers, the Oculus Debug Tool provides controls for toggling ASW during development.