VR Comfort and Usability - Challenges and Considerations

It’s important to consider comfort and usability when planning your locomotion system. This guide discusses techniques for making locomotion as comfortable as possible for your VR app’s users.

Locomotion Usability Overview: Challenges, Locomotion Types and Useful Techniques

The following chart lists several potential comfort and usability issues, and some of the techniques that can be used to improve the experience.

| Comfort Risks and Usability Issues | Associated Locomotion Types | Useful Techniques |
| --- | --- | --- |
| Vection | Avatar movement, Scripted movement, World pulling, Steering movement | Consistent framerate, Quick turns, Snap turns, Independent Visual Background (IVB), Vignetting, Instant velocity changes |
| Disorientation | Teleportation | Spatial sound effects, Blinks, Warps |
| Fatigue | Physical locomotion, World pulling | Artificial locomotion, Support seated use |
| Accessibility | Physical locomotion | Artificial locomotion, Support seated use |
| Space Limitations | Physical locomotion | Artificial locomotion |
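Several of the techniques in the table, such as snap turns, reduce vection by replacing continuous visual rotation with discrete steps. As a minimal sketch of that idea, the 30-degree increment and deadzone threshold below are illustrative defaults, not values prescribed by this guide:

```python
SNAP_ANGLE = 30.0   # degrees per snap; illustrative comfort default
DEADZONE = 0.7      # thumbstick magnitude required to trigger a snap

def snap_turn(current_yaw_deg, stick_x):
    """Return the new yaw after applying a discrete snap turn.

    Instead of rotating continuously (which produces vection), the view
    jumps by a fixed increment, so there is no sustained visual rotation.
    """
    if stick_x >= DEADZONE:
        return (current_yaw_deg + SNAP_ANGLE) % 360.0
    if stick_x <= -DEADZONE:
        return (current_yaw_deg - SNAP_ANGLE) % 360.0
    return current_yaw_deg  # below the deadzone: no turn
```

A real implementation would also latch the input, so one stick push produces exactly one snap until the stick returns to center.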

Comfort Risks

A comfortable VR experience is generally achieved by minimizing sensory mismatches and discontinuities with our real-world experience, and getting as many of our sensory processes to agree as possible. In this section, we describe the key comfort risks you should consider as you design your VR app for maximum user comfort.


Vection

Vection is the illusory perception of self-motion based on visual input consistent with such movement. It occurs most commonly when using artificial locomotion to move or turn in the virtual environment. The brain weighs the input from your sense of vision so heavily that you may feel as if you are moving despite actually staying still. Discomfort can arise when vision conflicts with information coming from your vestibular sense or your sense of proprioception.

VR player stands in place with hands slightly pointed out to the right while avatar runs in same direction.

Vestibular Sense

Vestibular sense, often referred to as your sense of balance, is detected by a set of structures in your inner ear known as the vestibular organs. These work like a biological inertial measurement unit, similar to the one in your headset, to detect the direction of gravity and any other acceleration of your head along the six degrees of freedom of motion: yaw, pitch, roll, and x-, y-, and z-axis translation.

It is important to note that this sensory system specifically responds to changes in the head’s motion, whether rotational or translational. This has two important implications: increasing or decreasing speed in any direction stimulates the sensory receptors in the vestibular organs, and the vestibular sense stops signaling motion to your brain once you move at a fixed velocity. After moving at a fixed velocity for a sufficient period of time, the fluid in the vestibular organs settles and the receptors are no longer stimulated. This is why many people find constant-velocity motion in VR relatively comfortable: vision reports motion that the vestibular sense would not have detected in the real world either.
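Because the vestibular organs respond to changes in velocity rather than to constant velocity, one common design consequence is the "instant velocity changes" technique from the table: jump straight to the target speed instead of easing into it. A minimal sketch of the idea, with hypothetical parameter names and an illustrative acceleration value:

```python
def step_velocity(current, target, dt, instant=True, accel=4.0):
    """Advance the locomotion speed (m/s) for one frame of duration dt (s).

    With instant=True the speed jumps straight to the target, so the
    visually implied acceleration is confined to a single frame. With
    instant=False the speed ramps at `accel` m/s^2, producing a sustained
    visual acceleration that the vestibular sense cannot confirm.
    """
    if instant:
        return target
    delta = target - current
    max_step = accel * dt
    if abs(delta) <= max_step:
        return target
    return current + max_step * (1 if delta > 0 else -1)
```

The ramping branch is included only for contrast; for comfort, the instant branch is usually the better default.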

Visual-Vestibular Mismatches and Comfort

Disagreements between your vision and vestibular system can lead to motion sickness. If you’ve ever gotten carsick from reading a book in a moving vehicle, or seasick on a boat when you couldn’t see the horizon, you have experienced the effects of a visual-vestibular mismatch. In those cases, vision tells your brain you are standing or sitting still while your vestibular sense tells your brain you are moving. Vection in VR creates the opposite, but similarly discomforting, experience: vision says you are moving, but your vestibular sense says you are staying still.

While different people have varying susceptibility to motion sickness, it is common for people to experience some degree of discomfort as a result of visual-vestibular mismatches when using VR, particularly when it is a new experience for the user. This can discourage them from returning to your experience or even turn them off of VR in general. Therefore, we highly recommend preventing or mitigating visual-vestibular mismatches as much as possible.


Proprioception

Proprioception refers to the perception or awareness of the position, movement, and extent of the body and its constituent parts. Your brain calculates this complex representation from sensory information about how different muscles are contracted or relaxed, the degree of flexion in different joints, and how touch receptors in your skin are or aren’t being activated. Proprioception is why you can, for example, close your eyes and touch your nose and otherwise know where and how your body is posed in space.

When the virtual representation of our body doesn’t match the mental model or perception of our physical body, this creates a visual-proprioceptive mismatch, which can be uncomfortable and can undermine the feeling of immersion. Common proprioceptive mismatches occur when head or hand tracking is unreliable, when tracking has too much latency, or when poor inverse kinematics (IK) causes the user’s avatar body to take on a pose different from what their real body is doing. These are just some of the ways that what we see in VR may not match what we feel.


Disorientation

Disorientation occurs whenever the user loses track of their position in their environment. This most commonly happens when the camera perspective suddenly changes significantly, and it takes a moment to reorient oneself within the world. It is associated with teleportation, snap turns, and any other discontinuities in the camera position or orientation.

VR player in adventure environment. Player teleports across level but is disoriented because the direction they’re facing has changed after teleporting.
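One way to implement the "blink" technique listed in the table for teleportation is to fade the screen to black, move the camera only while the view is fully dark, and then fade back in, hiding the discontinuity that causes disorientation. A rough sketch, where `move_camera` is a hypothetical stand-in for whatever callback actually relocates the camera rig in your engine:

```python
def blink_teleport(move_camera, fade_frames=9):
    """Run a teleport as a 'blink': fade out, move, fade in.

    `move_camera` is invoked only while the screen is fully black, so
    the user never sees the camera jump. Yields the fade opacity
    (0.0 = clear, 1.0 = black) once per frame.
    """
    for i in range(1, fade_frames + 1):       # fade out
        yield i / fade_frames
    move_camera()                              # teleport while fully black
    for i in range(fade_frames - 1, -1, -1):   # fade in
        yield i / fade_frames
```

Keeping the total blink short (a few hundred milliseconds) preserves responsiveness; pairing it with a spatial sound effect at the destination can further help the user reorient.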

Usability Issues

When building a locomotion system, it’s important to plan for the people who will use it, their needs, and the other individual factors that can affect how well the locomotion system works for them.

Space Limitations

While it may be tempting to simply require that users have a play space large enough to fit your virtual environment, this will drastically limit your potential audience. Some users need to play in stationary mode because they only have enough space to use VR while standing or sitting in one place. In practice, this means that unless the application is designed to be played from a stationary position without turning, it will need to support some kind of artificial locomotion or turning in order for everyone to be able to access the entire virtual environment.


Fatigue

Relying on physical locomotion to move around in your app can lead to user fatigue. Users sometimes start with a more physically active play style, but switch to a more relaxed style as gameplay progresses. While designing for continuous physical movement is not necessarily a bad thing, it should be a deliberate choice in your app design because it can limit the duration of play sessions. For some people, it may determine whether they can experience your app at all.

VR player leaning on couch, showing signs of fatigue.


Accessibility

Accessibility is important to consider when designing any experience. Some people may have physical needs that require them to stay seated, and others may simply prefer to stay seated during their VR experience. This can make certain physical actions frustratingly difficult or even impossible to execute. It can be helpful to consider a user’s physical limitations as they pertain to your controller system, how much dexterity will be required to perform certain actions, and how hand tracking might improve your app for people who have difficulty using hand controllers.

Check out the full guide, Designing for Accessibility, as this is a significant topic that impacts how you approach your app design.

Left: VR player sitting in wheelchair. Right: VR player sitting on couch.

The Importance of Predictability

People are less likely to experience discomfort if they can reliably predict how the camera will move through the virtual environment. This is one reason why we emphasize consistent, predictable control schemes and movement patterns here. Predictable controls let the user operate artificial locomotion more efficiently and experience less vection, and visual accelerations the user can anticipate are less discomforting than ones they cannot.

A first-person experience can be more comfortable if the user is controlling a visible avatar in the virtual environment that telegraphs how the camera will move. For example, the avatar might start walking in some direction before the camera starts moving to follow it; when the avatar stops, the camera catches up and decelerates to a stop shortly after. A lead of as little as a few hundred milliseconds gives the user enough warning to anticipate the changes in motion, which benefits comfort. Similarly, turning controls that always respond to input in exactly the same way, regardless of what else is happening, are more predictable than ones that disregard input in the middle of a turn. Considered individually, these may seem like relatively minor features, but when everything responds consistently and predictably, the overall experience is likely to be more comfortable for the user.
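The delayed camera-follow behavior described above can be sketched with exponential smoothing; the easing function and `smoothing` constant here are illustrative design choices, not a prescribed implementation, and a real system would operate on 3D vectors rather than a single coordinate:

```python
import math

def follow_camera(camera_pos, avatar_pos, dt, smoothing=5.0):
    """Ease the camera toward the avatar's position over one frame.

    Because the avatar visibly starts moving a beat before the camera
    does, the user can anticipate the upcoming camera motion; the camera
    then eases in, and decelerates to a stop after the avatar halts.
    The exp() term makes the lag frame-rate independent.
    """
    t = 1.0 - math.exp(-smoothing * dt)
    return camera_pos + (avatar_pos - camera_pos) * t
```

Calling this once per frame moves the camera a fixed fraction of the remaining distance each frame, so it accelerates gently when the avatar sets off and settles smoothly when the avatar stops.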

What’s Next: Design Techniques and Best Practices

With the considerations and challenges outlined above, and the previous page on potential design challenges, in mind, it’s time to start thinking more tactically about how you will design your locomotion system. The following guide and its sub-sections cover several design techniques and best practices used across the VR ecosystem: Locomotion Design Techniques and Best Practices.