OVRInput exposes a unified input API for multiple controller types. It may be used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. It currently supports the Oculus Touch, Microsoft Xbox controllers, and the Oculus remote on desktop platforms. For mobile development, it supports the Gear VR Controller as well as the touchpad and back button on the Gear VR headset. Gear VR gamepads must be Android compatible and support Bluetooth 3.0.

For keyboard and mouse control, we recommend using the UnityEngine.Input scripting API (see Unity’s Input scripting reference for more information).

Mobile input bindings are automatically added to InputManager.asset if they do not already exist.

For more information, see OVRInput in the Unity Scripting Reference. Unity’s input system and Input Manager are documented at http://docs.unity3d.com/Manual/Input.html and http://docs.unity3d.com/ScriptReference/Input.html.

SetControllerVibration() is now deprecated for Oculus Touch; please use OVRHaptics instead.


To use OVRInput, you must either:

  1. Include an instance of OVRManager anywhere in your scene; or
  2. Call OVRInput.Update() and OVRInput.FixedUpdate() once per frame at the beginning of any component’s Update and FixedUpdate methods, respectively.
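If you take the second approach, a minimal driver component might look like the following sketch (the class name is illustrative):

```csharp
using UnityEngine;

// Drives OVRInput manually when no OVRManager instance exists in the scene.
// Attach this to any active GameObject so it runs before other input queries.
public class OVRInputDriver : MonoBehaviour
{
    void Update()
    {
        // Refreshes OVRInput's per-frame state.
        OVRInput.Update();
    }

    void FixedUpdate()
    {
        // Refreshes OVRInput's fixed-timestep state.
        OVRInput.FixedUpdate();
    }
}
```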

Oculus Touch Tracking

OVRInput provides Touch position and orientation data through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a Vector3 and Quaternion, respectively.

Controller poses are returned by the constellation tracking system and are predicted simultaneously with the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial center eye pose, and may be used for rendering hands or objects in the 3D world. They are also reset by OVRManager.display.RecenterPose(), similar to the head and eye poses.
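For example, a script attached to a hand model could apply the tracked pose each frame. This is a sketch that assumes the transform is parented under the tracking space, so the reported local pose maps directly onto local transform values:

```csharp
using UnityEngine;

// Applies the left Touch controller's tracked pose to this transform.
// Assumes this transform is a child of the tracking space, so the
// controller's reported coordinate frame lines up with local space.
public class TouchPoseFollower : MonoBehaviour
{
    void Update()
    {
        transform.localPosition =
            OVRInput.GetLocalControllerPosition(OVRInput.Controller.LTouch);
        transform.localRotation =
            OVRInput.GetLocalControllerRotation(OVRInput.Controller.LTouch);
    }
}
```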

Gear VR Controller

Gear VR Controller provides orientation data through GetLocalControllerRotation(), which returns a quaternion.

Gear VR positions the controller relative to the user by using a body model to estimate the controller’s position. Whether the controller is visualized on the left or right side of the body is determined by the user’s handedness, which is specified during controller pairing.

To query the handedness of a paired controller, check for OVRInput.Controller.RTrackedRemote or OVRInput.Controller.LTrackedRemote with IsControllerConnected() or GetActiveController().

For example:

// returns true if right-handed controller connected
OVRInput.IsControllerConnected(OVRInput.Controller.RTrackedRemote);

Use OVRInput.Get() to query controller touchpad input. You may query the input position with Axis2D:

OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad, OVRInput.Controller.RTrackedRemote);

A touchpad touch occurs when the user’s finger makes contact with the touchpad without actively clicking it. Touches may be queried with OVRInput.Get(OVRInput.Touch.PrimaryTouchpad). Touchpad clicks are aliased to virtual button One clicks, and may be queried with OVRInput.Get(OVRInput.Button.PrimaryTouchpad).

To recenter a Gear VR Controller, use OVRInput.RecenterController().

The volume and home buttons are reserved.

OVRInput Usage

The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().

  • Get() queries the current state of a control.
  • GetDown() queries if a control was pressed this frame.
  • GetUp() queries if a control was released this frame.
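For instance, the three variants can be combined to implement a simple charge-and-release action (a sketch; the class and field names are illustrative):

```csharp
using UnityEngine;

// Charges while the primary button is held, fires when it is released.
public class ChargeFire : MonoBehaviour
{
    private float chargeTime;

    void Update()
    {
        if (OVRInput.GetDown(OVRInput.Button.One))   // pressed this frame
            chargeTime = 0f;

        if (OVRInput.Get(OVRInput.Button.One))       // currently held
            chargeTime += Time.deltaTime;

        if (OVRInput.GetUp(OVRInput.Button.One))     // released this frame
            Debug.Log("Fired after charging for " + chargeTime + "s");
    }
}
```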

Gear VR Controller Swiping Gestures

For Gear VR Controllers, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

  • Swipe up: Pull content upward. Equivalent to scrolling down.
  • Swipe down: Pull content downward. Equivalent to scrolling up.
  • Swipe left: Pull content left or go to the next item or page.
  • Swipe right: Pull content right or go to the previous item or page.
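Swipes register as virtual D-pad button events, so a scroll handler following the conventions above might look like this sketch (it queries the active controller; the Scroll method and scrollStep value are illustrative placeholders):

```csharp
using UnityEngine;

// Maps touchpad swipes on the active controller to natural scrolling.
public class TouchpadScroller : MonoBehaviour
{
    public float scrollStep = 40f; // illustrative scroll amount per swipe

    void Update()
    {
        // Swipe up pulls content upward, i.e. scrolls down.
        if (OVRInput.GetDown(OVRInput.Button.DpadUp))
            Scroll(-scrollStep);

        // Swipe down pulls content downward, i.e. scrolls up.
        if (OVRInput.GetDown(OVRInput.Button.DpadDown))
            Scroll(scrollStep);
    }

    void Scroll(float delta)
    {
        // Placeholder; apply the delta to your scroll view here.
        Debug.Log("Scroll by " + delta);
    }
}
```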

Control Input Enumerations

There are multiple variations of Get() that provide access to different sets of controls. These sets of controls are exposed through enumerations defined by OVRInput as follows:



  • OVRInput.Button: Traditional buttons found on gamepads, Touch controllers, the Gear VR Controller touchpad and back button, and the Gear VR headset touchpad and back button.
  • OVRInput.Touch: Capacitive-sensitive control surfaces found on the Oculus Touch and Gear VR Controller.
  • OVRInput.NearTouch: Proximity-sensitive control surfaces found on the Oculus Touch controllers.
  • OVRInput.Axis1D: One-dimensional controls, such as triggers, that report a floating point state.
  • OVRInput.Axis2D: Two-dimensional controls, including thumbsticks and the Gear VR Controller touchpad, that report a Vector2 state.

A secondary set of enumerations mirrors the first, defined as follows:

  • OVRInput.RawButton
  • OVRInput.RawTouch
  • OVRInput.RawNearTouch
  • OVRInput.RawAxis1D
  • OVRInput.RawAxis2D
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with creating control schemes that work across different types of controllers. The second set of enumerations provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of enumerations, since the virtual mapping provides useful functionality, as demonstrated below.

Button, Touch, and NearTouch

In addition to traditional gamepad buttons, the Oculus Touch controllers feature capacitive-sensitive control surfaces which detect when the user's fingers or thumbs make physical contact (a “touch”), as well as when they are in close proximity (a “near touch”). This allows for detecting several distinct states of a user’s interaction with a specific control surface. For example, if a user’s index finger is fully removed from a control surface, the NearTouch for that control will report false. As the user’s finger approaches the control and gets within close proximity to it, the NearTouch will report true prior to the user making physical contact. When the user makes physical contact, the Touch for that control will report true. When the user pushes the index trigger down, the Button for that control will report true. These distinct states can be used to accurately detect the user’s interaction with the controller and enable a variety of control schemes.
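The escalating states can be read back as successive checks, for example for the index trigger (a sketch; the log messages are illustrative):

```csharp
using UnityEngine;

// Reports which of the distinct interaction states the index finger is in.
public class IndexFingerState : MonoBehaviour
{
    void Update()
    {
        if (OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger))
            Debug.Log("Trigger pressed");                 // fully pressed
        else if (OVRInput.Get(OVRInput.Touch.PrimaryIndexTrigger))
            Debug.Log("Finger resting on trigger");       // physical contact
        else if (OVRInput.Get(OVRInput.NearTouch.PrimaryIndexTrigger))
            Debug.Log("Finger hovering near trigger");    // close proximity
        else
            Debug.Log("Finger off trigger");              // fully removed
    }
}
```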

The Gear VR Controller touchpad may be queried for both touch status and click status, where “touch” refers to the user’s finger making contact with the touchpad without actively clicking it.

Example Usage

// returns true if the primary button (typically “A”) is currently pressed.
OVRInput.Get(OVRInput.Button.One);

// returns true if the primary button (typically “A”) was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);

// returns true if the “X” button was released this frame.
OVRInput.GetUp(OVRInput.RawButton.X);

// returns a Vector2 of the primary (typically the Left) thumbstick’s current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

// returns true if the primary thumbstick is currently pressed (clicked as a button)
OVRInput.Get(OVRInput.Button.PrimaryThumbstick);

// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);

// returns a float of the secondary (typically the Right) index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);

// returns a float of the left index finger trigger’s current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.RawAxis1D.LIndexTrigger);

// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);

// returns true if the secondary gamepad button, typically “B”, is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);

// returns true after a Gear VR touchpad tap
OVRInput.GetDown(OVRInput.Button.One, OVRInput.Controller.Touchpad);

// returns true on the frame when a user’s finger pulled off the Gear VR touchpad on a swipe down
OVRInput.GetDown(OVRInput.Button.DpadDown, OVRInput.Controller.Touchpad);

// returns true the frame AFTER a user’s finger pulled off the Gear VR touchpad on a swipe right
OVRInput.GetDown(OVRInput.Button.DpadRight, OVRInput.Controller.Touchpad);

// returns true if the Gear VR back button is pressed
OVRInput.Get(OVRInput.Button.Back);

// returns true if the Gear VR Controller trigger is pressed down
OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTrackedRemote);

// Queries the active Gear VR Controller touchpad click position
// (normalized to a -1.0, 1.0 range, where -1.0, -1.0 is the lower-left corner)
OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad, OVRInput.Controller.RTrackedRemote);

// If no controller is specified, queries the touchpad position of the active Gear VR Controller
OVRInput.Get(OVRInput.Axis2D.PrimaryTouchpad);

// returns true if the Gear VR Controller back button is pressed
OVRInput.Get(OVRInput.Button.Back, OVRInput.Controller.RTrackedRemote);

// recenters the active Gear VR Controller. Has no effect for other controller types.
OVRInput.RecenterController();

// recenters the right Gear VR Controller (even if it is not active)
OVRInput.RecenterController(OVRInput.Controller.RTrackedRemote);

// returns true on the frame when a user’s finger pulled off the Gear VR Controller back button
OVRInput.GetUp(OVRInput.Button.Back, OVRInput.Controller.RTrackedRemote);

In addition to specifying a control, Get() also takes an optional controller parameter. The list of supported controllers is defined by the OVRInput.Controller enumeration (for details, refer to OVRInput in the Unity Scripting Reference).

Specify a controller when a particular control scheme is intended only for a certain controller type. If no controller parameter is provided to Get(), the default is the Active controller, which corresponds to the controller that most recently reported user input. For example, a user may use a pair of Oculus Touch controllers, set them down, and pick up an Xbox controller; the Active controller switches to the Xbox controller once the user provides input with it. The current Active controller can be queried with OVRInput.GetActiveController(), and a bitmask of all the connected Controllers can be queried with OVRInput.GetConnectedControllers().
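Since GetConnectedControllers() returns a bitmask, connected controllers are tested with a bitwise AND rather than equality (a sketch; the log messages are illustrative):

```csharp
using UnityEngine;

// Demonstrates querying the connected-controller bitmask and the
// active controller.
public class ControllerStatus : MonoBehaviour
{
    void Update()
    {
        OVRInput.Controller connected = OVRInput.GetConnectedControllers();

        // Bitmask test: true when both Touch controllers are present.
        if ((connected & OVRInput.Controller.Touch) == OVRInput.Controller.Touch)
            Debug.Log("Both Touch controllers are connected");

        // The active controller is a single value, so equality works here.
        if (OVRInput.GetActiveController() == OVRInput.Controller.Gamepad)
            Debug.Log("Gamepad is the active controller");
    }
}
```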

Example Usage:

// returns true if the Xbox controller’s D-pad is pressed up.
OVRInput.Get(OVRInput.Button.DpadUp, OVRInput.Controller.Gamepad); 

// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch); 

// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);

Querying the controller type can also be useful for distinguishing between equivalent buttons on different controllers. For example, if you want code to execute on input from a gamepad or Touch controller, but not on a Gear VR Touchpad, you could implement it as follows:

if (OVRInput.GetActiveController() != OVRInput.Controller.Touchpad) { /* do input handling */ }

Note that the Oculus Touch controllers may be specified either as the combined pair (with OVRInput.Controller.Touch), or individually (with OVRInput.Controller.LTouch and RTouch). This is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow more convenient development of hand-agnostic input code. See the virtual mapping diagrams in Touch Input Mapping for an illustration.

Example Usage:

// returns a float of the Hand Trigger’s current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);

// returns a float of the Hand Trigger’s current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);

This can be taken a step further to allow the same code to be used for either hand by specifying the controller in a variable that is set externally, such as on a public variable in the Unity Editor.

Example Usage:

// public variable that can be set to LTouch or RTouch in the Unity Inspector
public Controller controller; 
// returns a float of the Hand Trigger’s current state on the Oculus Touch controller  
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);

// returns true if the primary button (“A” or “X”) is pressed on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller); 

This is convenient since it avoids the common pattern of if/else checks for Left/Right hand input mappings.

Touch Input Mapping

The following diagrams illustrate common input mappings for Oculus Touch controllers. For more information on additional mappings that are available, refer to OVRInput in the Unity Scripting Reference.

Virtual Mapping (Accessed as a Combined Controller)

When accessing the Touch controllers as a combined pair with OVRInput.Controller.Touch, the virtual mapping closely matches the layout of a typical gamepad split across the Left and Right hands.

Virtual Mapping (Accessed as Individual Controllers)

When accessing the Left or Right Touch controllers individually with OVRInput.Controller.LTouch or OVRInput.Controller.RTouch, the virtual mapping changes to allow for hand-agnostic input bindings. For example, the same script can dynamically query the Left or Right Touch controller depending on which hand it is attached to, and Button.One will be mapped appropriately to either the A or X button.

Raw Mapping

The raw mapping directly exposes the Touch controllers. The layout of the Touch controllers closely matches the layout of a typical gamepad split across the Left and Right hands.

Rift Remote Input Mapping

Virtual Mapping

Raw Mapping

Xbox Input Handling

Virtual Mapping

This diagram shows a common implementation of Xbox controller input bindings using OVRInput.Controller.Gamepad.

Raw Mapping

The raw mapping directly exposes the Xbox controller.