OVRInput exposes a unified input API for multiple controller types.
It is used to query virtual or raw controller state, such as buttons, thumbsticks, triggers, and capacitive touch data. It supports the Oculus Touch controllers as well as gamepads such as the Xbox controller.
For keyboard and mouse control, we recommend using the
UnityEngine.Input scripting API (see Unity’s Input scripting reference for more information).
Mobile input bindings are automatically added to
InputManager.asset if they do not already exist.
For details, see OVRInput in the Unity Scripting Reference guide. For more information on Unity’s input system and Input Manager, see http://docs.unity3d.com/Manual/Input.html and http://docs.unity3d.com/ScriptReference/Input.html.
To use OVRInput, you must either include an instance of OVRManager anywhere in your scene, or call OVRInput.Update() and OVRInput.FixedUpdate() once per frame at the beginning of any component’s Update() and FixedUpdate() methods, respectively.
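If OVRManager is not present, a minimal pump component might look like the following sketch (the class name is illustrative, not part of the SDK):

using UnityEngine;

// Hypothetical helper: pumps OVRInput manually when OVRManager is not in the scene.
public class OVRInputUpdater : MonoBehaviour
{
    void Update()
    {
        // Refresh button, touch, and axis state once per render frame.
        OVRInput.Update();
    }

    void FixedUpdate()
    {
        // Refresh the state sampled at the physics rate.
        OVRInput.FixedUpdate();
    }
}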
OVRInput provides Touch controller position and orientation data through GetLocalControllerPosition() and GetLocalControllerRotation(), which return a Vector3 and Quaternion, respectively.
Controller poses are returned by the tracking system and are predicted simultaneously with the headset. These poses are reported in the same coordinate frame as the headset, relative to the initial center eye pose, and can be used for rendering hands or objects in the 3D world. They are also reset by
OVRManager.display.RecenterPose(), similar to the head and eye poses.
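As an illustrative sketch (not from the SDK samples), a component like the following could drive a hand model from the tracked controller pose; it assumes the object is parented under the camera rig’s tracking space so the local pose lines up with the headset’s coordinate frame:

using UnityEngine;

// Hypothetical example: follows the tracked pose of a Touch controller.
public class ControllerPoseFollower : MonoBehaviour
{
    // Set to LTouch or RTouch in the Unity Inspector.
    public OVRInput.Controller controller = OVRInput.Controller.RTouch;

    void Update()
    {
        // Poses are reported relative to the initial center eye pose,
        // so apply them as local position/rotation under the tracking space.
        transform.localPosition = OVRInput.GetLocalControllerPosition(controller);
        transform.localRotation = OVRInput.GetLocalControllerRotation(controller);
    }
}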
Note: Oculus Touch controllers are differentiated with Primary and Secondary in OVRInput:
Primary always refers to the left controller and
Secondary always refers to the right controller.
The primary usage of OVRInput is to access controller input state through Get(), GetDown(), and GetUp().
Get() queries the current state of a control.
GetDown() queries if a control was pressed this frame.
GetUp() queries if a control was released this frame.
There are multiple variations of
Get() that provide access to different sets of controls. These sets of controls are exposed through enumerations defined by OVRInput as follows:
|Control|Description|
|OVRInput.Button|Traditional buttons found on gamepads, Oculus Touch controllers, and the back button.|
|OVRInput.Touch|Capacitive-sensitive control surfaces found on the Oculus Touch controller.|
|OVRInput.NearTouch|Proximity-sensitive control surfaces found on the first generation Oculus Touch controller. Not supported on subsequent generations.|
|OVRInput.Axis1D|One-dimensional controls, such as triggers, that report a floating point state.|
|OVRInput.Axis2D|Two-dimensional controls, such as thumbsticks, that report a Vector2 state.|
A secondary set of enumerations mirrors the first: RawButton, RawTouch, RawNearTouch, RawAxis1D, and RawAxis2D.
The first set of enumerations provides a virtualized input mapping that is intended to assist developers with creating control schemes that work across different types of controllers. The second set of enumerations provides raw unmodified access to the underlying state of the controllers. We recommend using the first set of enumerations, since the virtual mapping provides useful functionality, as demonstrated below.
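As a quick sketch of the difference between the two sets (both queries assume a controller with an A button, such as a right Touch controller or gamepad, is active):

// Virtual mapping: Button.One resolves to the appropriate physical button
// for whichever controller is queried (e.g. A on the right Touch controller).
bool virtualOne = OVRInput.Get(OVRInput.Button.One);

// Raw mapping: RawButton.A always refers to the physical A button, unmodified.
bool rawA = OVRInput.Get(OVRInput.RawButton.A);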
In addition to traditional gamepad buttons, the Oculus Touch controllers feature capacitive-sensitive control surfaces which detect when the user’s fingers or thumbs make physical contact (
Touch), as well as when they are in close proximity (
NearTouch). This allows for detecting several distinct states of a user’s interaction with a specific control surface. For example, if a user’s index finger is fully removed from a control surface, the
NearTouch for that control will report false. As the user’s finger approaches the control and gets within close proximity to it, the
NearTouch will report true prior to the user making physical contact. When the user makes physical contact, the
Touch for that control will report true. When the user pushes the index trigger down, the
Button for that control will report true. These distinct states can be used to accurately detect the user’s interaction with the controller and enable a variety of control schemes.
// returns true if the primary button (typically "A") is currently pressed.
OVRInput.Get(OVRInput.Button.One);

// returns true if the primary button (typically "A") was pressed this frame.
OVRInput.GetDown(OVRInput.Button.One);

// returns true if the "X" button was released this frame.
OVRInput.GetUp(OVRInput.RawButton.X);

// returns a Vector2 of the primary (typically the Left) thumbstick's current state.
// (X/Y range of -1.0f to 1.0f)
OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

// returns true if the primary thumbstick is currently pressed (clicked as a button)
OVRInput.Get(OVRInput.Button.PrimaryThumbstick);

// returns true if the primary thumbstick has been moved upwards more than halfway.
// (Up/Down/Left/Right - Interpret the thumbstick as a D-pad).
OVRInput.Get(OVRInput.Button.PrimaryThumbstickUp);

// returns a float of the secondary (typically the Right) index finger trigger's current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.Axis1D.SecondaryIndexTrigger);

// returns a float of the left index finger trigger's current state.
// (range of 0.0f to 1.0f)
OVRInput.Get(OVRInput.RawAxis1D.LIndexTrigger);

// returns true if the left index finger trigger has been pressed more than halfway.
// (Interpret the trigger as a button).
OVRInput.Get(OVRInput.RawButton.LIndexTrigger);

// returns true if the secondary gamepad button, typically "B", is currently touched by the user.
OVRInput.Get(OVRInput.Touch.Two);
In addition to specifying a control,
Get() also takes an optional controller parameter. The list of supported controllers is defined by the
OVRInput.Controller enumeration (for details, refer to OVRInput in the Unity Scripting Reference guide).
Specifying a controller can be used if a particular control scheme is intended only for a certain controller type. If no controller parameter is provided to
Get(), the default is to use the
Active controller, which corresponds to the controller that most recently reported user input. For example, a user may use a pair of Oculus Touch controllers, set them down, and pick up an Xbox controller, in which case the Active controller will switch to the Xbox controller once the user provides input with it. The current Active controller can be queried with
OVRInput.GetActiveController(), and a bitmask of all connected controller types can be queried with OVRInput.GetConnectedControllers().
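For example, the following sketch checks whether the Touch controller pair is currently the active input device:

// Query which controller most recently reported user input.
OVRInput.Controller active = OVRInput.GetActiveController();

// Query a bitmask of all connected controller types.
OVRInput.Controller connected = OVRInput.GetConnectedControllers();

// True when the Touch pair is the active controller.
bool touchIsActive = (active == OVRInput.Controller.Touch);

// True when the Touch pair is among the connected controllers.
bool touchIsConnected = (connected & OVRInput.Controller.Touch) == OVRInput.Controller.Touch;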
// returns a float of the Hand Trigger's current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.Touch);

// returns a float of the Hand Trigger's current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.SecondaryHandTrigger, OVRInput.Controller.Touch);
Note: Oculus Touch controllers can be specified either as the combined pair (with
OVRInput.Controller.Touch), or individually (with
OVRInput.Controller.LTouch and OVRInput.Controller.RTouch). This is significant because specifying LTouch or RTouch uses a different set of virtual input mappings that allow more convenient development of hand-agnostic input code.
// returns a float of the Hand Trigger's current state on the Left Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);

// returns a float of the Hand Trigger's current state on the Right Oculus Touch controller.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.RTouch);
This can be taken a step further to allow the same code to be used for either hand by specifying the controller in a variable that is set externally, such as a public variable set in the Unity Inspector.
// public variable that can be set to LTouch or RTouch in the Unity Inspector
public OVRInput.Controller controller;

// returns a float of the Hand Trigger's current state on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);

// returns true if the primary button ("A" or "X") is pressed on the Oculus Touch controller
// specified by the controller variable.
OVRInput.Get(OVRInput.Button.One, controller);
This is convenient since it avoids the common pattern of if/else checks for left or right hand input mappings.
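Putting this together, a complete component along these lines (names illustrative, not from the SDK) keeps a single code path for both hands:

using UnityEngine;

// Hypothetical example: hand-agnostic input handling for either Touch controller.
public class HandTriggerWatcher : MonoBehaviour
{
    // Set to LTouch or RTouch in the Unity Inspector.
    public OVRInput.Controller controller;

    void Update()
    {
        // The same virtual mapping works for either hand.
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, controller);

        if (OVRInput.GetDown(OVRInput.Button.One, controller))
        {
            Debug.Log("Primary button pressed on " + controller + ", grip = " + grip);
        }
    }
}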
The following diagrams illustrate common input mappings for Oculus Touch controllers. For more information on additional mappings that are available, refer to OVRInput in the Unity Scripting Reference guide.
When accessing the Oculus Touch controllers as a combined pair with
OVRInput.Controller.Touch, the virtual mapping closely matches the layout of a typical gamepad split across the left and right hands.
When accessing the left or right controller individually with
OVRInput.Controller.LTouch or OVRInput.Controller.RTouch, the virtual mapping changes to allow for hand-agnostic input bindings. For example, the same script can dynamically query the left or right controller depending on which hand it is attached to, and
Button.One is mapped appropriately to either the A or X button.
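A brief illustration of this remapping:

// With RTouch, Button.One refers to the A button;
// with LTouch, the same query refers to the X button.
bool rightPrimary = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.RTouch);
bool leftPrimary = OVRInput.Get(OVRInput.Button.One, OVRInput.Controller.LTouch);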
The raw mapping directly exposes the controllers. The layout of the controllers closely matches the layout of a typical gamepad split across the left and right hands.