User Input and Navigation

Overview

  • No traditional input method is ideal for VR, but gamepads are currently our best option; innovation and research are necessary and ongoing at Oculus.
  • Users can’t see their input devices while in the Rift; let them use a familiar controller that they can operate without sight.
  • Leverage the Rift’s sensors for control input (e.g., aiming with your head), but be careful of nauseating interactions between head movements and virtual motion.
  • Locomotion can create novel problems in VR.
  • Consider offering a “tank mode” style of movement that users can toggle. Include a means of resetting heading to the current direction of gaze.

Mouse, Keyboard, Gamepad

It’s important to realize that once users put on the Oculus Rift, they can’t see their keyboard, their mouse, their gamepad, or their monitor. Once inside, they interact with these devices by touch alone. Of course, this isn’t so unusual: we are used to operating our input devices by touch, but we rely on sight for initial orientation and corrections (such as repositioning a hand on the keyboard). This has important ramifications for interaction design. For instance, any use of the keyboard as a means of input is bound to be awkward, since the user can find individual keys or the home position only by touch. A mouse will be somewhat easier to use, as long as the user has a clear idea of where it is before putting on the headset.

Although perhaps still not the ultimate solution, gamepads are the most popular traditional controller at this time. The user can grip a gamepad with both hands and is not bound by the ergonomic constraints of operating a more complicated control device on a desktop. The more familiar the controller, the more comfortable a user will be operating it without visual reference.

We believe gamepads are preferable over keyboard and mouse input. However, we must emphasize that neither input method is ideal for VR, and research is underway at Oculus to find innovative and intuitive ways of interacting with a wide breadth of VR content.

Alternative Input Methods

As an alternative to aiming with a mouse or controller, some VR content lets users aim with their head. For example, the user aims a reticle or cursor that is centered in whatever direction he or she is currently facing. Internally, we refer to this method as “ray-casting.” User testing at Oculus suggests ray-casting can be an intuitive and user-friendly interaction method, as long as the user has a clear targeting cursor (rendered at the depth of the object it is targeting) and adequate visual feedback indicating the effects of gaze direction. For example, if using this method for selecting items in a menu, elements should react to contact with the targeting reticle/cursor in a salient, visible way (e.g., animation, highlighting). Also keep in mind that targeting with head movements offers only limited precision; in the case of menus, items should be large and well-spaced enough for users to target them accurately. Furthermore, users might move their heads without intending to change their target, for instance when a tooltip appears in the periphery, outside a menu navigated by ray-casting. User testing is ultimately necessary to determine whether ray-casting fits your content.
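
To make the idea concrete, the following engine-agnostic C++ sketch shows one way gaze ray-casting against menu items might be implemented. The Vec3 and MenuItem types, the bounding-sphere hit test, and the default reticle depth are illustrative assumptions for this example, not part of the Oculus SDK; a full implementation would also test scene geometry so the reticle never renders at a mismatched depth.

    #include <cmath>
    #include <vector>

    struct Vec3 { float x, y, z; };

    static Vec3  Sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
    static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct MenuItem {
        Vec3  center;   // world-space position of the menu element
        float radius;   // generous bounds: items should be large and easy to target
        bool  hovered;  // drives salient visual feedback (highlighting, animation)
    };

    // headPos is the tracked head position; gazeDir is the unit-length forward
    // vector of the head pose. Returns the distance along the gaze ray at which
    // the reticle should be drawn, so it renders at the depth of its target.
    float UpdateGazeTargeting(Vec3 headPos, Vec3 gazeDir, std::vector<MenuItem>& items,
                              float defaultReticleDepth = 2.0f)
    {
        float reticleDepth = defaultReticleDepth;
        for (MenuItem& item : items) {
            Vec3  toItem = Sub(item.center, headPos);
            float along  = Dot(toItem, gazeDir);   // distance along the gaze ray
            Vec3  perp   = Sub(toItem, { gazeDir.x * along, gazeDir.y * along, gazeDir.z * along });
            item.hovered = along > 0.0f && Dot(perp, perp) <= item.radius * item.radius;
            if (item.hovered)
                reticleDepth = along;              // place the reticle on the item
        }
        return reticleDepth;
    }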

The Rift’s sensors report orientation, acceleration, and position, which are used primarily to orient and control the virtual camera, but these readings can also be leveraged for unique control schemes, such as gaze- and head-/torso-controlled movement. For example, users might look in the direction they want to move and lean forward to move in that direction. Although some content has implemented such control methods, their comfort and usability compared to traditional input methods are still unknown.
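
As a rough illustration of how positional tracking could drive such a scheme, the sketch below maps horizontal lean away from a calibrated neutral head position onto movement. The neutral pose, dead zone, and speed scaling are assumptions chosen for the example; as noted above, the comfort of this kind of scheme is unproven and needs user testing.

    #include <cmath>

    struct Vec3 { float x, y, z; };

    // headPos: current tracked head position; neutralHeadPos: head position
    // captured while the user sits upright. Returns a world-space velocity
    // that moves the user toward the direction of their lean.
    Vec3 LeanToMoveVelocity(Vec3 headPos, Vec3 neutralHeadPos,
                            float deadZoneMeters = 0.05f, float speedPerMeter = 4.0f)
    {
        float dx   = headPos.x - neutralHeadPos.x;   // horizontal offset only
        float dz   = headPos.z - neutralHeadPos.z;
        float lean = std::sqrt(dx * dx + dz * dz);

        if (lean < deadZoneMeters)                   // ignore small, unintentional sway
            return { 0.0f, 0.0f, 0.0f };

        float speed = (lean - deadZoneMeters) * speedPerMeter;
        return { dx / lean * speed, 0.0f, dz / lean * speed };
    }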

As a result, developers must assess any novel control scheme to ensure it does not unintentionally frustrate or discomfort novice users. For example, head tilt can seem like a reasonable control scheme in theory, but if a user is rotating in VR and tilts their head off the axis of rotation, this action creates a “pseudo-Coriolis effect.” Researchers have found that the pseudo-Coriolis effect consistently induces motion sickness in test subjects,[1] so it should be avoided in any head-tilt-based control scheme. Similar unintended effects may lurk in any novel input method, which underscores the need to test it with users.

Navigation

For most users, locomotion will occur through some form of input rather than actually standing up and walking around. Common approaches simply carry over methods of navigation from current-generation first-person games, using either a gamepad or keyboard and mouse. Unfortunately, traditional controls, while effective for navigating a video game environment, can sometimes cause discomfort in immersive VR. For example, the simulator sickness section above described issues with strafing and backward walking that do not affect console and PC games. We are currently researching new control schemes for navigation in VR.

Alternative control schemes have been considered for improving user comfort during locomotion. In traditional control schemes, pressing “forward” typically moves the user in whatever direction the camera is pointed. However, developers might instead use a “tank mode” or “tank view” for navigation, in which the input device controls the direction of locomotion and the user controls the camera independently with head movements. For example, a user keeps walking along the same straight path as long as they are only pressing forward, and moving their head lets them look around the environment without affecting heading. One might liken this to browsing an aisle in a store: your legs follow a straight path down the aisle while your head turns side to side to look around, independently of where you are walking.

This alternative control scheme has its pros and cons. Some users in the Oculus office (and presumably the developers who have implemented it in extant content) find this method of control more comfortable than traditional navigation models. However, it can also introduce new issues with discomfort and user experience, particularly because the direction of the user’s head and the direction of locomotion can become misaligned: a user who wants to move straight forward in the direction they are looking may actually be moving along a diagonal heading simply because their head and body are turned in their chair. Anyone using this method for navigation should therefore include an easy way for users to reset the heading of the “tank” to match the current direction of gaze, such as clicking in an analog stick or pressing a button.
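
The following condensed C++ sketch shows one way the core of such a scheme could be structured: stick input is rotated by a body heading that head tracking never modifies, and a reset call snaps that heading back to the current gaze yaw. The class name, the flat-ground yaw representation, and the button mapping are assumptions for illustration, not an Oculus SDK interface.

    #include <cmath>

    struct TankLocomotion {
        float headingYaw = 0.0f;   // radians; the direction the "tank" walks

        // Call when the user clicks a stick or presses the reset button so
        // that "forward" once again matches where they are looking.
        void ResetHeadingToGaze(float headYaw) { headingYaw = headYaw; }

        // stickX/stickY are in [-1, 1]. Head yaw is deliberately not an input:
        // looking around never changes the direction of travel.
        void Move(float stickX, float stickY, float speed, float& velX, float& velZ) const {
            velX = (stickY * std::sin(headingYaw) + stickX * std::cos(headingYaw)) * speed;
            velZ = (stickY * std::cos(headingYaw) - stickX * std::sin(headingYaw)) * speed;
        }
    };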

Further research is necessary to fully determine the comfort and effectiveness of “tank mode” under different use cases, but it represents an alternative to traditional control schemes that developers might consider as a user-selectable option.

For now, traditional input methods are a familiar and accessible option for most users, as long as developers are mindful of avoiding known issues we have described in this guide.

Some content also lends itself to alternative means of moving the player around a virtual space. For instance, a user might progress through different levels, each of which starts in a new location. Some games fade to black to convey the player falling asleep or losing consciousness, and then have them awaken somewhere else as part of the narrative. These conventions can be carried over to VR with little issue; however, it is important to note that changing the user’s location in the virtual space outside their control (e.g., a 90° jump in perspective to the right, or moving them to another location in the same map) can be disorienting and, depending on the accompanying visuals, potentially uncomfortable.
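
If your content must relocate the user, one common mitigation is to hide the discontinuity behind a brief fade to black. The sketch below is a minimal, engine-agnostic example of that pattern; the fade timing, the Vec3 type, and how the black overlay is actually drawn are assumptions left to your engine.

    #include <algorithm>

    struct Vec3 { float x, y, z; };

    // Fades the view to black, moves the player while the screen is dark, then
    // fades back in, so the change of location never appears as a sudden cut.
    struct FadeRelocate {
        enum class Phase { FadingOut, FadingIn, Done } phase = Phase::FadingOut;
        float fadeAlpha = 0.0f;    // 0 = scene fully visible, 1 = fully black
        float fadeSpeed = 3.0f;    // alpha change per second (~0.33 s each way)
        Vec3  destination;

        // Call once per frame; returns the overlay alpha to draw this frame.
        float Update(float dt, Vec3& playerPos) {
            if (phase == Phase::FadingOut) {
                fadeAlpha = std::min(1.0f, fadeAlpha + fadeSpeed * dt);
                if (fadeAlpha >= 1.0f) {         // relocate only once fully black
                    playerPos = destination;
                    phase = Phase::FadingIn;
                }
            } else if (phase == Phase::FadingIn) {
                fadeAlpha = std::max(0.0f, fadeAlpha - fadeSpeed * dt);
                if (fadeAlpha <= 0.0f)
                    phase = Phase::Done;
            }
            return fadeAlpha;
        }
    };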

[1] Dichgans, J. & Brandt, T. (1973). Optokinetic motion sickness and pseudo-coriolis effects induced by moving visual stimuli. Acta Oto-laryngologica, 76, 339-348.