Oculus Quest Development

All Oculus Quest developers must pass the concept review before they can receive publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.

Designing for Hands

Hands are a promising new input method, but there are limitations to what we can implement today due to computer vision and tracking constraints. The following design guidelines enable you to create content that works within these limitations.

In these guidelines, you’ll find interactions, components, and best practices we’ve validated through researching, testing, and designing with hands. We also included the principles that guided our process. This information is by no means exhaustive, but should provide a good starting point so you can build on what we’ve learned so far. We hope this helps you design experiences that push the boundaries of what hands can do in virtual reality.

The Benefits

People have been looking forward to hand tracking for a long time, and for good reason. A number of things make hands an appealing input modality for end users.

  • Hands are a highly approachable, low-friction input that requires no additional hardware
  • Unlike other input devices, they are automatically present as soon as you put on a headset
  • Self presence and social presence are richer in experiences where you’re able to use your real hands
  • Your hands aren’t holding anything, leaving them free to make adjustments to physical objects like your headset

The Challenges

There are some complications that come up when designing experiences for hands. Thanks to sci-fi movies and TV shows, people have exaggerated expectations of what hands can do in VR. But even expecting your virtual hands to work the same way your real hands do is currently unrealistic for a few reasons.

  • There are inherent technological limitations, like limited tracking volume and issues with occlusion
  • Virtual objects don’t provide the tactile feedback that we rely on when interacting with real-life objects
  • Choosing hand gestures that activate the system without accidental triggers can be difficult, since hands form all sorts of poses throughout the course of regular conversation

You can find our solutions to some of these challenges in the Best Practices section.
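As one illustration of the gesture-triggering challenge above, a common way to reduce accidental pinches is hysteresis: only enter the pinch state when the thumb and index fingertips come very close together, and only leave it once they clearly separate. The C++ sketch below is a minimal version of that idea, assuming a hand-tracking runtime that reports fingertip positions each frame; the types and threshold values are illustrative placeholders, not part of any SDK.

```cpp
#include <cmath>

// Hypothetical per-frame hand data; a real SDK would supply fingertip poses.
struct Vec3 {
    float x, y, z;
};

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Debounced pinch detector: enters the "pinching" state only below a tight
// threshold, and leaves it only above a looser one (hysteresis), so noisy
// tracking or casual conversational hand poses are less likely to toggle it.
class PinchDetector {
public:
    // Thresholds in meters; values here are illustrative, tune per experience.
    explicit PinchDetector(float enterDist = 0.015f, float exitDist = 0.035f)
        : enterDist_(enterDist), exitDist_(exitDist) {}

    // Call once per tracking frame with the thumb and index fingertip positions.
    bool Update(const Vec3& thumbTip, const Vec3& indexTip) {
        const float d = Distance(thumbTip, indexTip);
        if (!pinching_ && d < enterDist_) {
            pinching_ = true;   // fingers came firmly together: start the pinch
        } else if (pinching_ && d > exitDist_) {
            pinching_ = false;  // fingers clearly separated: end the pinch
        }
        return pinching_;
    }

private:
    float enterDist_;
    float exitDist_;
    bool pinching_ = false;
};
```

The gap between the enter and exit thresholds is what prevents the gesture from flickering on and off when the fingertips hover near a single cutoff distance.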

The Capabilities

To be an effective input modality, hands need to allow for the following interaction primitives, or basic tasks:

  • Targeting, which moves focus to a specific object
  • Selection, which lets users choose or activate that object
  • Manipulation, or moving, rotating, or scaling the object in space (a direct-manipulation sketch follows this list)
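
As a rough sketch of the manipulation primitive, the hypothetical C++ snippet below moves an object while a pinch is held inside its grab radius; the types, thresholds, and pinch input are illustrative assumptions rather than an SDK API.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 Add(const Vec3& a, const Vec3& b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
static float Length(const Vec3& v) { return std::sqrt(v.x * v.x + v.y * v.y + v.z * v.z); }

// A grabbable object reduced to a position and a grab radius.
struct Grabbable {
    Vec3 position;
    float grabRadius;
};

// Direct "move" manipulation: while a pinch is held inside the grab radius,
// the object tracks the pinch point by the same frame-to-frame offset.
class GrabMover {
public:
    void Update(Grabbable& object, const Vec3& pinchPoint, bool pinching) {
        if (pinching && !grabbing_ &&
            Length(Sub(pinchPoint, object.position)) <= object.grabRadius) {
            grabbing_ = true;              // pinch started inside the object: grab it
            lastPinchPoint_ = pinchPoint;
        }
        if (grabbing_ && pinching) {
            const Vec3 delta = Sub(pinchPoint, lastPinchPoint_);
            object.position = Add(object.position, delta);  // follow the hand
            lastPinchPoint_ = pinchPoint;
        }
        if (!pinching) {
            grabbing_ = false;             // releasing the pinch drops the object
        }
    }

private:
    bool grabbing_ = false;
    Vec3 lastPinchPoint_{};
};
```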

These interactions can be performed directly, using your hands much as you would in real life to poke and pinch at items, or at a distance through raycasting, which points a ray from your hand at objects or two-dimensional panels.
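To make targeting and selection concrete for the raycasting case, here is a minimal, hypothetical C++ sketch: a pointer pose supplied by the hand-tracking runtime is cast against spherical targets, and the frame a pinch begins acts as the selection. The pose, target, and pinch inputs are stand-ins for whatever your runtime actually provides, and the ray direction is assumed to be normalized.

```cpp
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 Sub(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Hypothetical pointer pose derived from the tracked hand (origin + unit direction).
struct PointerPose {
    Vec3 origin;
    Vec3 direction;
};

// Simplified selectable object: a sphere the ray can hit.
struct Target {
    Vec3 center;
    float radius;
};

// Targeting primitive: return the index of the first target the ray passes
// through, or -1 if nothing is hit. (Closest-hit ordering is omitted for brevity.)
int FindTargetedObject(const PointerPose& pose, const std::vector<Target>& targets) {
    for (std::size_t i = 0; i < targets.size(); ++i) {
        const Vec3 toCenter = Sub(targets[i].center, pose.origin);
        const float along = Dot(toCenter, pose.direction);        // distance along the ray
        if (along < 0.0f) continue;                                // target is behind the hand
        const float distSq = Dot(toCenter, toCenter) - along * along;
        if (distSq <= targets[i].radius * targets[i].radius) {
            return static_cast<int>(i);
        }
    }
    return -1;
}

// Selection primitive: the targeted object is activated on the frame the pinch begins.
int UpdateSelection(const PointerPose& pose, const std::vector<Target>& targets,
                    bool pinchingNow, bool& pinchingLastFrame) {
    const int targeted = FindTargetedObject(pose, targets);
    const bool pinchStarted = pinchingNow && !pinchingLastFrame;
    pinchingLastFrame = pinchingNow;
    return pinchStarted ? targeted : -1;  // -1 means "nothing selected this frame"
}
```

Separating targeting from selection in this way lets you show hover feedback on the targeted object every frame, while the actual activation happens only on the pinch transition.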

You can find more of our thinking in our Interactions section.

Today, human ergonomics, technological constraints, and disproportionate user expectations all make for challenging design problems. But hand tracking has the potential to fundamentally change the way people interact with the virtual world around them. We can’t wait to see the solutions you come up with.