VR Accessibility Design: Controller Mapping, Input, and Feedback

With Oculus VR HMDs, we provide controllers that allow developers to map certain functions to different buttons, send haptic cues, and track arm and hand movements. It’s important to be cognizant of how people will use these controllers and buttons in your application. Some users may not be able to use all of their fingers, or may have limited range of motion in their wrists, fingers, or arms. A user’s visual capabilities may also affect how well they’re able to understand and use the controller system you’ve created. Finally, remember that the best way to check whether your in-app controller configurations and interactions are accessible is to test them with as many users as possible, including users with various disabilities.

In this section we provide a number of recommendations and best practices to ensure that you are maximizing your app’s accessibility, from your controller system to interactions inside your app and in the physical world.

This document contains the following sections:

  • Provide option to select a dominant hand
  • Use haptic feedback for additional clarity
  • Add ray casts to simplify navigation and selection
  • Visually represent controller inputs with button highlights
  • Offer the option to personalize controller configurations
  • Minimize the complexity of your controller scheme
  • Minimize button press requirements
  • Be sure to consider hand tracking
  • Develop with voice input to further minimize user friction

Provide option to select a dominant hand

Perhaps the simplest accessibility feature to implement is handedness, or the ability to choose which hand in the VR experience is a user’s dominant hand. It should come as no surprise that someone who is left-handed in the physical world would prefer that this is also reflected in your VR app. We generally recommend offering this option within your settings menu, or ideally within your initial app tutorial.
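As a rough sketch (the types and names here are hypothetical, not part of any Oculus SDK), handedness can be implemented as a single layer of indirection between logical roles and physical controllers, so game code never hard-codes "the right controller":

    #include <iostream>

    // Hypothetical types for illustration; not an actual Oculus API.
    enum class Hand { Left, Right };
    enum class Role { Dominant, OffHand };

    struct HandednessSetting {
        Hand dominant = Hand::Right;  // default; let users change this in settings

        // Resolve a logical role (e.g. "pointer hand") to a physical controller.
        Hand resolve(Role role) const {
            if (role == Role::Dominant) return dominant;
            return dominant == Hand::Right ? Hand::Left : Hand::Right;
        }
    };

    int main() {
        HandednessSetting settings;
        settings.dominant = Hand::Left;  // user picked left-handed in the menu

        // Game code asks for roles, never for a hard-coded "right controller".
        Hand pointer = settings.resolve(Role::Dominant);
        std::cout << (pointer == Hand::Left ? "left" : "right")
                  << " controller drives pointing and selection\n";
    }

Because every interaction asks for a role rather than a physical hand, flipping the setting updates the whole app at once.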

Use haptic feedback for additional clarity

Haptic feedback and controller vibrations can be used to add interactivity, enhance your design, and deepen a player’s immersion. As much as possible, implement vibrations/haptics so they’re easily distinguishable when they need to communicate different things. Use timing, duration, and intensity to create differentiation, as sketched in the example after the list below.

Here is a snapshot of what your haptic feedback could represent:

  • An alert or warning.
  • Controller intersection with virtual objects and colliders.
  • A signal of left and right directionality.
  • Button clicks or selections (correct or incorrect). See the Warnings section in the Design module that follows.
  • In-game sounds, such as large crashes, drum beats, etc.
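Below is a minimal sketch of how distinct patterns and a master toggle might be organized; the HapticPattern struct and play function are illustrative stand-ins for whatever vibration call your engine or SDK actually provides:

    #include <cstdio>

    // Hypothetical haptics layer for illustration; a real app would forward
    // these values to its SDK's vibration call (amplitude, duration).
    struct HapticPattern {
        const char* name;
        float amplitude;    // 0.0 - 1.0
        int   durationMs;
        int   pulses;       // repeat count separates similar patterns further
    };

    // Distinct timing/duration/intensity so cues are distinguishable by feel.
    const HapticPattern kWarning   = {"warning",   1.0f, 400, 3};
    const HapticPattern kSelection = {"selection", 0.4f,  60, 1};
    const HapticPattern kCollision = {"collision", 0.7f, 120, 1};

    bool gHapticsEnabled = true;  // expose this in the Options menu

    void play(const HapticPattern& p) {
        if (!gHapticsEnabled) return;  // respect the user's comfort setting
        for (int i = 0; i < p.pulses; ++i)
            std::printf("vibrate %s: amp=%.1f for %dms\n", p.name, p.amplitude, p.durationMs);
    }

    int main() {
        play(kWarning);    // e.g. unseen enemy nearby
        play(kSelection);  // button click confirmation
    }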

With these cues in mind, you might add controller vibrations to alert the user to a nearby enemy in case they are unable to hear the approaching footsteps, or when a villain touches them so they can physically feel the world around them.

It’s important to note that while adding vibrations is a good way to provide extra context for players, it can also lead to over-stimulation. Consider offering the ability to enable/disable haptic feedback within your Options menu.

Add ray casts to simplify navigation and selection

A ray cast is rendered as a ray of light that shoots out from the user’s controller, indicating where the controller is pointing in the 360 space. Ray casts can assist users with low vision, who may be unable to see directly where their controllers are pointing without a bright color or light illuminating the direction. These visual guides can also support players who have difficulty with spatial depth and gauging distance in a virtual world. If you decide to include ray casts in your app, we recommend enabling users to adjust the color, size, and/or shape of their ray casts so they can navigate the world on their own terms.
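Here is one way that customization might be structured; the Vec3 and RayCastStyle types are hypothetical, and a real implementation would hand these values to your renderer rather than printing them:

    #include <cstdio>

    // Hypothetical math and render types for illustration only.
    struct Vec3 { float x, y, z; };

    // User-adjustable appearance, persisted alongside other settings.
    struct RayCastStyle {
        float r = 1.0f, g = 0.2f, b = 0.2f;  // bright, high-contrast default
        float width = 0.01f;                  // meters; thicker aids low vision
        float maxLength = 10.0f;              // meters
    };

    // Each frame: draw a beam from the controller along its forward vector.
    void drawRayCast(const Vec3& origin, const Vec3& forward, const RayCastStyle& s) {
        Vec3 end = { origin.x + forward.x * s.maxLength,
                     origin.y + forward.y * s.maxLength,
                     origin.z + forward.z * s.maxLength };
        // A real renderer would draw a line or cylinder here; we just log it.
        std::printf("beam to (%.1f, %.1f, %.1f), width %.3f, color (%.1f %.1f %.1f)\n",
                    end.x, end.y, end.z, s.width, s.r, s.g, s.b);
    }

    int main() {
        RayCastStyle style;
        style.width = 0.02f;  // user doubled the thickness in settings
        drawRayCast({0, 1.4f, 0}, {0, 0, -1}, style);
    }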

Check out Facebook Horizon or simply go to Oculus Home for examples of raycasting design. The following image also shows an example of a ray cast.

Image: a ray cast extending from the user’s controller.

Visually represent controller inputs with button highlights

Many of your users may not be acclimated to the button locations on the Oculus Touch controllers, or may have difficulty understanding which buttons perform which actions. To help these users become confident with using the controllers within your app, add a light or an outlining color to highlight the buttons that must be pressed for a given interaction.

You might also render the controllers so that the user has a reference for how the controller is positioned in the real world. This can be particularly helpful within the tutorial portion of your app, or in situations where a controller has to be turned or moved in the space to complete an interaction.
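A sketch of how a tutorial step might drive button highlighting (the TutorialStep type and button names are illustrative; a real app would swap materials or enable an outline shader on the rendered controller model):

    #include <cstdio>
    #include <set>
    #include <string>

    // Hypothetical tutorial helper; real apps would change the button's
    // material or add an outline shader in their engine instead of printing.
    struct TutorialStep {
        std::string prompt;
        std::set<std::string> requiredButtons;  // buttons to highlight
    };

    void renderControllers(const TutorialStep& step) {
        const char* allButtons[] = {"A", "B", "X", "Y", "Grip", "Trigger"};
        for (const char* b : allButtons) {
            bool highlight = step.requiredButtons.count(b) > 0;
            std::printf("%-7s %s\n", b, highlight ? "<-- highlighted" : "");
        }
        std::printf("Prompt: %s\n", step.prompt.c_str());
    }

    int main() {
        TutorialStep grab{"Hold Grip to pick up the lantern", {"Grip"}};
        renderControllers(grab);
    }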

The clip below is from the survival horror game Lies Beneath, where the team at Drifter used both of the techniques outlined above for their tutorial. Both controllers are fully rendered, the trigger buttons are highlighted, and a text description of the interaction is clearly displayed.

Clip from the Lies Beneath tutorial: the user holds up the controllers, the relevant buttons highlight, and text instructions display in front of the user.

Offer the option to personalize controller configurations

Controller configurations and button mapping can make or break a VR application. As a key takeaway, simplify control schemes in your application to improve response time and user experience.

When mapping your controllers, consider all the possible button configurations, as some users may navigate through the app on a single controller or with limited mobility. One solution to help make your application more accessible is to enable people to customize their controller configuration. This gives people the agency to interact with your app experience in the way that is most comfortable for them, while still retaining your intended control scheme.

Controller configuration, like any other setting, should be savable so that users don’t lose these preferences. Consider the user experience: after spending the time to customize these controls, your audience does not want to re-enter this information simply because they have updated their hardware or software.
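One possible shape for this, assuming a simple action-to-button map persisted to a plain text file (the file format and helper names are invented for illustration):

    #include <fstream>
    #include <iostream>
    #include <map>
    #include <string>

    // Hypothetical remapping layer: game code dispatches on actions, and a
    // user-editable map decides which physical button triggers each action.
    using BindingMap = std::map<std::string, std::string>;  // action -> button

    void save(const BindingMap& bindings, const std::string& path) {
        std::ofstream out(path);
        for (const auto& [action, button] : bindings)
            out << action << '=' << button << '\n';
    }

    BindingMap load(const std::string& path, const BindingMap& defaults) {
        BindingMap bindings = defaults;  // fall back to defaults if file missing
        std::ifstream in(path);
        std::string line;
        while (std::getline(in, line)) {
            auto eq = line.find('=');
            if (eq != std::string::npos)
                bindings[line.substr(0, eq)] = line.substr(eq + 1);
        }
        return bindings;
    }

    int main() {
        BindingMap defaults = {{"jump", "A"}, {"grab", "Grip"}};
        BindingMap user = load("bindings.cfg", defaults);
        user["jump"] = "Trigger";    // user remapped jump in the settings menu
        save(user, "bindings.cfg");  // preferences survive restarts and updates
        std::cout << "jump is bound to " << user["jump"] << '\n';
    }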

Minimize the complexity of your controller scheme

In addition to making your controller configuration customizable, you should also consider minimizing the number of controls. If your controls are too complicated, people with physical, cognitive, or learning disabilities may experience delayed response times. Simplifying these controls by decreasing the number of buttons needed to progress, or even reducing the actions down to one controller, enables people to react more quickly with minimal cognitive effort.

As an example, you could minimize your controls to one button for interacting with objects. This way, a user knows that all interactions with objects require a single button press. They don’t have to discern between the X button for opening crates, the Y button for searching them, the grips for holding objects, the trigger for launching them, and so on.
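Sketched out, a single context-sensitive "interact" handler might look like this (the TargetKind enum and behaviors are hypothetical):

    #include <cstdio>

    // Hypothetical context-sensitive dispatch: one button, and the object the
    // user is pointing at decides what "interact" means.
    enum class TargetKind { Crate, HeldObject, None };

    void onInteractPressed(TargetKind target) {
        switch (target) {
            case TargetKind::Crate:      std::puts("open and search the crate");  break;
            case TargetKind::HeldObject: std::puts("release or throw the object"); break;
            case TargetKind::None:       std::puts("nothing to interact with");    break;
        }
    }

    int main() {
        // The same single button press covers every object interaction.
        onInteractPressed(TargetKind::Crate);
        onInteractPressed(TargetKind::HeldObject);
    }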

Minimize button press requirements

As you explore controller configurations, take into account that not everyone can hold the controllers comfortably for the entire duration of the experience, or consistently perform button presses and holds. Ask yourself: how many times does a player have to press a button on the controller to finish a level? If a player needs to hold down a button, how long do they have to hold it? If your app naturally requires repetitive button pressing, consider offering a configuration that adjusts this requirement.
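For example, a hold-to-grab interaction can be offered as a press-to-toggle alternative; this sketch uses an invented GrabInput helper to show the idea:

    #include <cstdio>

    // Hypothetical grab logic: the same game action supports "hold to grab"
    // and, as an accessibility option, "press once to grab, press again to release".
    struct GrabInput {
        bool toggleMode = false;  // expose in the accessibility settings
        bool holding = false;

        // Called with the raw button state each frame; returns whether to grab.
        bool update(bool buttonDown, bool buttonJustPressed) {
            if (toggleMode) {
                if (buttonJustPressed) holding = !holding;  // no sustained hold
                return holding;
            }
            return buttonDown;  // classic behavior: grab only while held
        }
    };

    int main() {
        GrabInput grab;
        grab.toggleMode = true;
        std::printf("grabbing: %d\n", grab.update(true, true));   // press -> grab
        std::printf("grabbing: %d\n", grab.update(false, false)); // button released, still grabbing
        std::printf("grabbing: %d\n", grab.update(true, true));   // press again -> release
    }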

Be sure to consider hand tracking

Oculus hand tracking can provide a multitude of functions for the developer and the user. As an interaction method, it removes the need for a controller for people who are unable to use one or have difficulty with it. As a communication tool, it can help people who are deaf or experiencing hearing loss use sign language. Hands are also a highly approachable, low-friction input: they require no additional hardware and, unlike other input devices, are automatically present as soon as you put on a headset. We recommend exploring the Designing for Hands documentation to find creative ways to use hand tracking for accessible design.
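As a rough illustration of hands as an input, a pinch gesture (thumb tip close to index tip) is a common controller-free stand-in for "select"; the joint positions and threshold below are placeholders for what your hand-tracking API would actually report:

    #include <cmath>
    #include <cstdio>

    // Hypothetical joint data; a real app would read fingertip positions
    // from its hand-tracking API each frame.
    struct Vec3 { float x, y, z; };

    float distance(const Vec3& a, const Vec3& b) {
        return std::sqrt((a.x - b.x) * (a.x - b.x) +
                         (a.y - b.y) * (a.y - b.y) +
                         (a.z - b.z) * (a.z - b.z));
    }

    bool isPinching(const Vec3& thumbTip, const Vec3& indexTip) {
        const float kPinchThreshold = 0.02f;  // meters; tune per app
        return distance(thumbTip, indexTip) < kPinchThreshold;
    }

    int main() {
        Vec3 thumb = {0.10f, 1.20f, -0.30f};
        Vec3 index = {0.11f, 1.21f, -0.30f};  // fingertips nearly touching
        std::printf("pinch select: %s\n", isPinching(thumb, index) ? "yes" : "no");
    }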

Develop with voice input to further minimize user friction

In addition to hand tracking, voice input is another interaction method that removes the need for hardware beyond the headset. Voice input is a great interaction method to design for if you want to make sure a wider audience can progress through your application. Keep in mind that some users will rely on it (typically those with physical disabilities) while others will need an alternative (those with speech impairments), so use voice as an additional control channel, not the only way to control actions.
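One way to keep voice additional rather than exclusive is to route recognized phrases and button presses into the same action table; everything in this sketch (phrase strings, action names) is hypothetical:

    #include <cstdio>
    #include <functional>
    #include <map>
    #include <string>

    // Hypothetical sketch: voice phrases and button presses both route into
    // one action dispatcher, so voice is an extra channel, never the only one.
    using ActionHandler = std::function<void()>;

    int main() {
        std::map<std::string, ActionHandler> actions = {
            {"open_menu", [] { std::puts("menu opened"); }},
            {"grab",      [] { std::puts("object grabbed"); }},
        };

        // Both input paths resolve to the same action names.
        std::map<std::string, std::string> voicePhrases = {{"open menu", "open_menu"}};
        std::map<std::string, std::string> buttons      = {{"B", "open_menu"}};

        actions[voicePhrases["open menu"]]();  // recognized speech
        actions[buttons["B"]]();               // same action via button
    }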