Oculus Quest Development

All Oculus Quest developers must pass the concept review before gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.

User Interface Components

Hands don’t come with buttons or switches the way other input devices do. To compensate, we experimented with components that provide continuous feedback throughout users’ interactions.

You can read more about affordances, signifiers, and feedback in our Best Practices section.


Buttons are one of the most common components out there, so we’ve done extensive design and research into how to interact with them in the absence of tactile feedback.

To provide continuous feedback, buttons should change states throughout an interaction, starting as soon as the hand approaches.

Button States

  1. Default: the body of the button with text.
  2. Focus: a frame around the button provides a reference for the button’s movement between states.
  3. Hover: the body of the button moves toward the finger as it approaches.
  4. Contact: the button changes color or receives a touch mark when touched.
  5. Action: the button moves backward through the frame when pressed.
  6. Released: the body of the button returns to its default state once the finger breaks contact.
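The state progression above can be sketched as a simple lookup from hand-tracking readings to a button state. The distance thresholds here are illustrative assumptions, not published values, and the names are not from any SDK:

```python
from enum import Enum, auto

class ButtonState(Enum):
    DEFAULT = auto()   # resting body with text
    FOCUS = auto()     # frame appears as a movement reference
    HOVER = auto()     # body moves toward the approaching finger
    CONTACT = auto()   # color change / touch mark on touch
    ACTION = auto()    # body pressed backward through the frame

def next_state(fingertip_distance_cm: float, touching: bool, pressed: bool) -> ButtonState:
    """Map per-frame hand-tracking readings to a button state.

    Thresholds (5 cm hover, 15 cm focus) are assumptions for illustration.
    """
    if pressed:
        return ButtonState.ACTION
    if touching:
        return ButtonState.CONTACT
    if fingertip_distance_cm < 5.0:    # finger approaching the button
        return ButtonState.HOVER
    if fingertip_distance_cm < 15.0:   # hand near the panel
        return ButtonState.FOCUS
    return ButtonState.DEFAULT
```

The Released state in the list above is simply the transition back to DEFAULT once contact breaks, so it needs no separate state here.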


Near Field Buttons

Button Size: Buttons should be at least 6 centimeters on each axis. Accuracy and speed drop off quickly for targets smaller than 5 centimeters, and performance plateaus above 6 centimeters.

Collider Shape: Include a collider area just outside the button so that near-misses count as selections. Square or rectangular colliders tend to perform better than rounded edges.

Layout Density: Allow 1 to 3 centimeters between buttons to maximize performance and minimize misses. Any more space than that will have a negative effect on accuracy.

Shape: The visual shape of a button has no effect on performance, as long as it’s confined in a rectangular collision boundary.

Distance: Placing buttons at 80% of a user’s farthest reach is ideal. Assuming a median arm length of 61 centimeters, that puts buttons 49 centimeters from the center eye of the headset.
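The sizing, spacing, and distance guidance above can be collected into a quick sanity check. A minimal sketch, with function names of our own invention:

```python
def ideal_button_distance_cm(arm_length_cm: float = 61.0) -> float:
    """80% of the user's farthest reach, measured from the headset's
    center eye. With the median 61 cm arm, this is ~49 cm."""
    return round(0.8 * arm_length_cm)

def near_field_layout_ok(button_size_cm: float, gap_cm: float) -> bool:
    """Check a layout against the guidance above: buttons at least
    6 cm on each axis, with 1-3 cm between neighboring buttons."""
    return button_size_cm >= 6.0 and 1.0 <= gap_cm <= 3.0
```

A layout of 6 cm buttons with 2 cm gaps passes; 5 cm buttons or 4 cm gaps do not.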

Far Field Buttons

Angular size: This is the apparent size of the button independent of its distance, and should be no smaller than 1.6 degrees.

For far-field buttons, it’s important that targeting precision doesn’t suffer as a button gets farther away. To do this, we specify buttons by angular size, which measures the proportion of your field of view that the button takes up. For example, if a target is 2 meters away and its angular size is 1.6 degrees, the target diameter would be 5.6 centimeters. At 10 meters, the target diameter would be 28 centimeters.

Pinch-and-Pull Components

The pinch-and-pull handle became possible thanks to the introduction of the pinch as an interaction method. The handle can be pinched, held, and then released to make precise selections. This component feels more satisfying than a virtual push button because there is no expectation of resistance when releasing an elastic string.

Pinch and drag

The pinch-and-pull handle was the foundation for a few additional components:


1D Picker: Components that move along one axis, like a slider, a simple menu, or a time scrubber for videos.

2D Picker: Pulling the handle out can also allow users to make selections along 2 axes for things like a color picker or a radial selector. The full palette of options is revealed once the user pulls the handle outward, and the handle can be moved left, right, up, and down to select the desired option.

3D Picker: The space opened up by pulling the handle can be used to provide further options, giving a user the ability to make selections across 3 axes. We experimented with a volumetric color picker, where the user can move the handle left, right, up, and down to select colors, and forward or backward in space to select shade.
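At its core, a 1D picker like the one above maps the pinched handle’s displacement along its axis to a normalized value. A minimal sketch, with illustrative geometry and names:

```python
def slider_value(handle_pos_cm: float, track_start_cm: float, track_length_cm: float) -> float:
    """Map the pinched handle's position along the track to a 0..1 value,
    clamped so overshooting either end of the track doesn't break the
    selection. A 2D or 3D picker applies the same mapping per axis."""
    t = (handle_pos_cm - track_start_cm) / track_length_cm
    return max(0.0, min(1.0, t))
```

For example, a handle pinched to the midpoint of a 10 cm track yields 0.5; pulling past either end clamps to 0 or 1.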


A short, more directional pointer can encourage users to focus on the cursor rather than the raycast, which improves accuracy when raycasting with hands.

Pointer 1

The pointer’s squishable shape signifies that you’re meant to pinch to interact with it, and its color, opacity and shape change throughout the open, almost-pinched and pinched states.



The cursor provides continuous feedback on the state of the pinch, which allows the user to focus on what they’re targeting instead of having to shift their gaze toward their hands. The fill, outline and scale of the cursor change throughout the open, almost-pinched and pinched states. The cursor also visually pops when the pinch occurs, and remains in the same state until the pinch is released.

Note: We designed our cursor with a semi-transparent fill to ensure that it doesn’t obscure targets, and a secondary dark outline to make sure it’s visible against bright backgrounds. The cursor inversely scales based on distance to maintain a consistent angular size.
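The distance-based scaling in the note above amounts to scaling the cursor’s world size linearly with its distance, so its angular size stays constant. A sketch; the reference distance of 1 meter is an assumption:

```python
def cursor_scale(distance_m: float, reference_distance_m: float = 1.0) -> float:
    """Scale factor that keeps the cursor's angular size constant:
    a cursor twice as far away is rendered at twice the world scale,
    so it subtends the same angle in the user's field of view."""
    return distance_m / reference_distance_m
```

At 2 meters the cursor is drawn at twice its reference scale; at half a meter, at half.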


The hands we designed consist of two elements: a static fill and a dynamic outline.


The dynamic outline changes color to provide continuous feedback between the open, almost-pinched, and pinch states.

System Pose: When the palm faces the user to perform the system gesture, the outline is blue. The pinching fingers turn light blue as they start to pinch.

Pointing Pose: When the hand faces outward and targets an object, the outline is light grey. The pinching fingers turn white as they start to pinch.

Note: For the fill, we used a translucent dark grey with a fresnel-based opacity, which gives the hand presence in bright environments without obscuring what’s behind it. The fingers fully occlude each other when overlapping (or else you’d be staring down the inside of your fingers). Both the fill and outline fade out at the wrist, since the wrist’s angle isn’t tracked and regularly breaks immersion and presence.