Sensory Reinforcement

Discomfort is often triggered by inconsistencies between sensory inputs such as vision, proprioception, and the vestibular system. The techniques outlined in this guide help reduce those mismatches between the senses and make the user experience as comfortable as possible.

Consistent Frame Rate and Head Tracking

A consistent frame rate makes it possible for the camera perspective to reliably match the player’s physical pose. Seeing the view update reliably and with low latency when looking in any given direction is essential for an application to be comfortable. If the frame rate is not consistent, the player will experience judder: the virtual camera position no longer matches the physical camera position. In practice, this means a previously rendered frame that was valid at some point in the past remains visible to the player despite the continued physical movement of the HMD. Judder is uncomfortable to experience and should be prevented as much as possible.

Because maintaining a solid frame rate is such an important factor in maximizing comfort in VR, the Oculus system software provides a reprojection feature called Asynchronous Time Warp (ATW), which reduces the effect of judder when the application doesn’t submit frames fast enough to keep up with the display refresh rate. This technique takes the previously submitted frame and re-renders it on the current frame, warped so that it tracks the current head position. Reprojection is not nearly as effective as maintaining a consistent frame rate in the first place, so it is important to optimize your application until it reliably runs at the display refresh rate.
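To make the idea concrete, here is a minimal sketch of rotation-only reprojection. It illustrates the concept rather than the actual ATW implementation, and the glm types and function name are assumptions for the example:

```cpp
// A conceptual sketch of rotational reprojection (the idea behind ATW),
// not the actual Oculus implementation. glm stands in for your math lib.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

// renderedOrientation: head orientation the stale frame was rendered with.
// predictedOrientation: head orientation predicted for the next refresh.
glm::mat4 ComputeReprojectionMatrix(const glm::quat& renderedOrientation,
                                    const glm::quat& predictedOrientation)
{
    // Rotation of the head since the stale frame was rendered.
    glm::quat delta = predictedOrientation * glm::inverse(renderedOrientation);
    // Counter-rotate the stale image by that delta so it stays locked to
    // the world while the head keeps moving (view space, rotation only).
    return glm::mat4_cast(glm::inverse(delta));
}
```

Note that this only corrects for rotation; positional changes and in-scene animation still judder, which is one more reason reprojection cannot substitute for rendering at the refresh rate.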

Independent Visual Backgrounds

An independent visual background (IVB) can reduce discomfort by helping the brain reinterpret the visual information it is receiving from VR, and in some cases by reducing the amount of visible optic flow. Usually, when your brain sees as much coherent, correlated motion in your visual field as it does during virtual locomotion, it is because you are the one moving through a stable world. We rarely, if ever, face situations in the real world where we are stationary while large portions of our surroundings move around us. However, if you can give your brain sufficient visual evidence that the world really is moving around you, you will no longer perceive yourself as moving. Consequently, your vestibular and visual senses will better agree, which in turn will reduce discomfort.

As one might expect, it is no straightforward task to convince your brain that the visual motion of virtual locomotion is the result of the world moving around you rather than you moving through the world. An independent visual background places geometry or imagery in the environment that is consistent with what the vestibular organs are sensing. A simple example would be a visually rich skybox that, instead of responding to thumbstick input, responds only to head movement. Say you put your users facing north in a grassy field with a cloud-filled sky an infinite distance away. If they use virtual locomotion to walk around the field, no matter what thumbstick input they use, they would still be looking at the northern sky. However, if they turn their head to the left or right, they would see the western or eastern skies, respectively.
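On the rendering side, the core trick can be sketched in a few lines. This is a hedged example assuming an engine where the final camera transform combines an artificial-locomotion transform with the tracked HMD pose; the function name and glm types are illustrative, not a specific engine’s API:

```cpp
// A minimal sketch, assuming glm for math. The skybox view matrix is
// built from the tracked HMD orientation alone: no HMD translation and
// no artificial-locomotion transform, so thumbstick input never moves it.
#include <glm/glm.hpp>
#include <glm/gtc/quaternion.hpp>

glm::mat4 SkyboxViewMatrix(const glm::quat& hmdOrientation)
{
    // The view matrix is the inverse of the camera's rotation. Physical
    // head turns rotate the skybox view; virtual locomotion does not.
    return glm::mat4_cast(glm::inverse(hmdOrientation));
}
```

The rest of the scene is rendered with the usual locomotion-plus-HMD camera; only the IVB layer uses this head-only matrix.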

In essence, the IVB is like putting the user in a giant stable room. They can physically look around and move through the IVB if they turn their heads and move their bodies. When they see some visual motion as a result of virtual locomotion, the user’s brain can then plausibly interpret that motion as occurring around them while they are staying still in their stable IVB.

In the past several years of VR, there have been a few notable examples of IVBs. In the game Vanguard V, the user is always flying forward through space. The skybox shows an outer-space environment of star fields and planets far enough away that you would not see any motion parallax from the movements you and your avatar make. Any perception of movement through space is communicated by the way relatively nearby objects pass by and by visual cues like motion lines.

Brass Tactics also uses an IVB to help make moving through a virtual miniature battlefield more comfortable. The game takes place in a castle war hall where battles unfold on a giant table in front of the player. Although the player can grab the tabletop and move it around to traverse the battlefield, the castle environment behaves like a normal room. This helps your brain form an interpretation of the visual image that is consistent with your vestibular sense: that you, the player, are sitting inside a room, and the movement of the battlefield in front of you is the tabletop getting moved around inside this war hall.

Independent visual backgrounds can be effective, but their unique behavior does not lend itself to easy implementation in just any VR app. In the previous examples, the user is in a very open environment that can remain stable and has an in-universe explanation for its behavior. It may be impossible to show the player an IVB in a small corridor, and without a story or explanation for why the skybox is not responding to thumbstick input, the experience might seem confusing or even broken. Still, some developers are exploring creative ways to create a more general-purpose IVB so it might benefit a wider variety of VR experiences.

For example, Google Earth VR combines the IVB with a vignette/occlusion effect. Among its comfort options is a setting where, any time the user moves around the virtual Earth, the periphery of their field of view, instead of simply being darkened or obscured, is replaced with a view of a simple, static environment consisting of a grid-lined floor plane and a skybox. The net effect is that the user’s brain can form the perception that they are standing in the stable VR environment, and the motion of the virtual Earth is simply occurring inside a portal or window in front of them. Digital Lode’s Espire does something similar whenever the player uses artificial locomotion, fading in a translucent 3D grid around the center of the player’s field of view.
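One piece of such a system is deciding how strongly to show the static comfort environment at any moment. The sketch below is a hedged example, assuming a single per-frame update that knows the rig’s current artificial-locomotion speed; the function name and thresholds are made up for illustration. A shader would then composite the static grid environment over the moving scene in the periphery, weighted by this blend factor:

```cpp
// Returns the new blend factor in [0, 1]: 0 = normal scene periphery,
// 1 = static comfort environment (grid floor + skybox) fully visible.
// Names and constants are illustrative assumptions, not an engine API.
#include <algorithm>
#include <cmath>

float UpdateComfortBlend(float currentBlend, float locomotionSpeed,
                         float deltaTime)
{
    const float kSpeedThreshold = 0.1f; // m/s; below this, fade the IVB out
    const float kFadeRate = 6.0f;       // blend units per second

    float target = (locomotionSpeed > kSpeedThreshold) ? 1.0f : 0.0f;
    float diff = target - currentBlend;
    // Move toward the target at a capped rate so the static environment
    // fades in and out smoothly instead of popping.
    float step = std::min(kFadeRate * deltaTime, std::fabs(diff));
    return currentBlend + (diff >= 0.0f ? step : -step);
}
```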

IVBs are easily confused with a simple cockpit, heads-up display (HUD), or other form of vignetting or occlusion. While those can similarly reduce the amount of visible optic flow on screen, there are some key differences. As a general rule, an IVB must afford the user the interpretation that the on-screen motion comes from the world or scenery moving around them rather than from them moving through the world. A cockpit or HUD does little to foster this percept: the user still perceives themselves as moving, just while in a vehicle or while wearing a virtual HMD.

We do not yet know all the factors that can contribute to this interpretation, but the past several years of design experimentation have revealed some important observations. First, as one may expect, the amount of visual field occupied by optic flow vs. the IVB matters; users have to see the IVB for their brains to use it. At some point (which will vary by context and by user), the visual information signaling an IVB becomes insufficient, and vection can once again prevail as the conscious percept.

The perceived depth relationship between the IVB and the main foreground content, as defined by binocular disparity and occlusion depth cues, can play a role as well. An IVB that is farther away from the user than the part of the environment signaling optic flow better lends itself to the interpretation that the user is stationary in a stable room or environment in which objects are moving around them. If depth cues make the user perceive the IVB as sitting between themselves and the optic flow, it can be interpreted as a HUD or cockpit, which is less effective. There is still a lot to learn about what makes an effective versus ineffective IVB, so we encourage design experimentation and user testing to see whether an IVB can fit into and improve the comfort of your VR experience.

Simulated Activities

Many developers and users find comfort benefits from controlling artificial locomotion through the re-enactment of physical activities, such as walking in place or pulling on the rungs of a ladder to climb (as opposed to using thumbstick or button inputs to accomplish the same experience). Of course, forcing the user to engage in approximations of physical activities creates a risk of fatigue and accessibility issues if users do not have an alternative movement scheme, so weigh your options carefully.

There are many possible reasons why physically simulating these activities might improve comfort. For simulated walking, it has been theorized that the proprioceptive and vestibular input may better align with the visual motion, reducing sensory conflict. Another possibility is that such sensory inputs introduce noise into your perceptual system, making it relatively ambiguous to your brain whether you are actually walking through space or not. Devices that vibrate against the head to stimulate the vestibular organs operate on a similar principle. The effectiveness of such methods varies from person to person, depending on how each person happens to interpret and perceive all the sensory input.
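As an illustration of the simulated-walking idea, the sketch below drives forward motion from the vertical head bob produced by stepping in place. This is a rough heuristic sketched for this guide, not an established algorithm; every name and threshold is an assumption that would need tuning and user testing:

```cpp
// Hedged sketch of walking-in-place locomotion: detect the downward-then-
// upward head bob of a step and grant a short burst of forward speed.
struct WalkInPlaceState {
    float prevHeight   = 0.0f; // last frame's HMD height (meters)
    float prevVelocity = 0.0f; // last frame's vertical head velocity (m/s)
    float stepTimer    = 1e9f; // seconds since the last detected step
};

// headHeight: current tracked HMD height; dt: frame time (> 0).
// Returns forward speed (m/s), typically applied along the horizontal
// component of the gaze direction.
float UpdateWalkInPlace(WalkInPlaceState& s, float headHeight, float dt)
{
    const float kMinBobVelocity = 0.15f; // m/s of downward head motion
    const float kStepWindow     = 0.6f;  // keep moving ~0.6 s per step
    const float kWalkSpeed      = 1.4f;  // roughly average walking speed

    float velocity = (headHeight - s.prevHeight) / dt;
    // A downward-to-upward zero crossing with enough amplitude ~ one step.
    bool step = (s.prevVelocity < -kMinBobVelocity) && (velocity > 0.0f);

    s.prevHeight   = headHeight;
    s.prevVelocity = velocity;
    s.stepTimer    = step ? 0.0f : s.stepTimer + dt;

    return (s.stepTimer < kStepWindow) ? kWalkSpeed : 0.0f;
}
```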

In the case of climbing a ladder or world pulling, the user has more granular and predictable control of environmental movement than a simple thumbstick press provides. The act of grabbing and moving the environment with one’s hands and arms creates a strong sensorimotor feedback loop in which the coherent motion the user sees is causally linked to their hand movements in a more tangible way than simply tilting a thumbstick. It can also create an alternative interpretation, similar to an independent visual background: your actions are moving the environment geometry while you stay still. Games such as Crytek’s The Climb and Ready At Dawn’s Lone Echo use the world-pulling technique to great effect.
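A minimal sketch of the world-pulling idea, under assumptions: while the grip is held, the player rig is translated opposite to the hand’s motion, so the world appears to be dragged past a stationary body. The names and glm types are illustrative:

```cpp
// Hedged sketch of grab-based "world pulling" locomotion.
#include <glm/glm.hpp>

struct PullState {
    bool      gripping = false;
    glm::vec3 prevHandPos{0.0f}; // hand position in rig-local tracking space
};

// playerOrigin: the rig's world-space position, updated in place.
void UpdateWorldPull(PullState& s, bool gripHeld,
                     const glm::vec3& handPos, glm::vec3& playerOrigin)
{
    if (gripHeld && s.gripping) {
        // Pull the hand down a rung -> the body rises; drag the world
        // right -> the player moves left. Rig moves opposite the hand.
        playerOrigin -= (handPos - s.prevHandPos);
    }
    s.gripping    = gripHeld;
    s.prevHandPos = handPos;
}
```

Because the hand delta is measured in tracking space, moving the rig does not feed back into the measurement, so the grabbed point stays pinned under the hand.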

Spatial Sound Effects

Environmental sound effects can help reduce disorientation when you implement a blink effect, or any other effect that occludes the environment. Imagine teleporting toward a loud vehicle on the far side of a ringing alarm. If the listener’s position changes along with the blink effect, the user will hear the alarm pass by, along with the sound of the approaching vehicle, during the short time the screen is dark. This can help users orient themselves in the environment.
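One way to realize this, sketched below under assumptions: sweep the audio listener from the departure point to the destination over the blink’s dark interval so spatialized sounds audibly pass by. SetListenerPosition is a stand-in for whatever listener API your audio engine provides:

```cpp
// Hedged sketch: move the audio listener through space during a blink
// teleport so nearby sounds pan past the user while the screen is dark.
#include <glm/glm.hpp>

// Stand-in for your audio engine's listener API (assumption).
void SetListenerPosition(const glm::vec3& /*worldPos*/) { /* engine call */ }

void UpdateBlinkAudio(const glm::vec3& from, const glm::vec3& to,
                      float blinkElapsed, float blinkDuration)
{
    // t sweeps 0 -> 1 across the short dark period of the blink.
    float t = glm::clamp(blinkElapsed / blinkDuration, 0.0f, 1.0f);
    SetListenerPosition(glm::mix(from, to, t));
}
```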

Check out the VR Audio Best Practice Guide for more on spatial sound design for your VR app.

More Locomotion Techniques and Best Practices

See below for the sections outlining the many design techniques and best practices to help inspire and inform your next VR locomotion system.