The Unity package provided with this blog post contains code that emulates a Gear VR Controller using a Touch controller. This lets you iterate faster by previewing basic Gear VR Controller behavior directly in the Unity Editor, without waiting for a full mobile build.
How the simulated controller works
The Gear VR Controller's position is determined by an IK system. The source code for this system is included in the Oculus Native Mobile SDK. Code relevant to determining the Gear VR Controller's position can be found in the following folder:
<sdk root>/VrSamples/Native/VrController/Src
This folder contains all of the code used to position the Gear VR Controller. The scripts in the bundle provided with this post port the functionality of OVR_Skeleton and OVR_ArmModel to C#, allowing the code to simulate the position and orientation of a Gear VR Controller given only the orientation of a Touch controller.
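To give a sense of how this kind of arm model works, here is a heavily simplified, hypothetical sketch (not the actual OVR_ArmModel port). It derives a plausible controller position purely from the controller's orientation by swinging a fixed shoulder/elbow/wrist chain; all names and offsets below are illustrative assumptions.

using UnityEngine;

// Illustrative only: a toy arm model that guesses a controller position
// from its orientation, in the spirit of OVR_ArmModel (not the real port).
public static class SimpleArmModel
{
    // Hypothetical offsets in meters, tracking space, for a right-handed controller.
    static readonly Vector3 ShoulderOffset = new Vector3(0.19f, -0.25f, 0.0f); // head to shoulder
    static readonly Vector3 ForearmOffset  = new Vector3(0.0f, 0.0f, 0.25f);   // shoulder to elbow
    static readonly Vector3 WristOffset    = new Vector3(0.0f, 0.0f, 0.25f);   // elbow to controller

    public static Vector3 GetPosition(Quaternion controllerRotation)
    {
        // Swing the whole arm with the controller's yaw only, then bend the
        // forearm with the full controller rotation.
        Quaternion yawOnly = Quaternion.Euler(0f, controllerRotation.eulerAngles.y, 0f);
        Vector3 elbow = ShoulderOffset + yawOnly * ForearmOffset;
        return elbow + controllerRotation * WristOffset;
    }
}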
Controller Mapping
The thumb stick of the Touch controller maps to the touch pad of the Gear VR Controller. Resting your finger on the thumb stick is the same as touching the touch pad without pressing it, and pressing the thumb stick is the same as pressing the touch pad. The thumb stick's position also provides the coordinates of the simulated touch.
The Touch trigger maps to the Gear VR Controller trigger. The Y or B button of the Touch controller maps to the back button of the Gear VR Controller. Which button is used depends on the handedness of the controller being simulated.
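As a rough sketch of how this mapping could be read with OVRInput (the package's internal implementation may differ), the following assumes a Touch controller standing in for a Gear VR Controller:

using UnityEngine;

// Sketch of the Touch-to-Gear VR mapping described above.
public class MappingExample : MonoBehaviour
{
    // Which Touch controller stands in for the Gear VR Controller.
    public OVRInput.Controller touch = OVRInput.Controller.RTouch;

    void Update()
    {
        // Thumb stick rest/press -> touch pad touch/press.
        bool padTouched = OVRInput.Get(OVRInput.Touch.PrimaryThumbstick, touch);
        bool padPressed = OVRInput.Get(OVRInput.Button.PrimaryThumbstick, touch);

        // Thumb stick deflection -> simulated touch coordinates.
        Vector2 padPosition = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick, touch);

        // Index trigger -> Gear VR Controller trigger.
        bool triggerPressed = OVRInput.Get(OVRInput.Button.PrimaryIndexTrigger, touch);

        // B (right) or Y (left) -> back button; Button.Two resolves to B or Y
        // depending on which controller is passed in.
        bool backPressed = OVRInput.Get(OVRInput.Button.Two, touch);

        Debug.LogFormat("touched={0} pressed={1} pos={2} trigger={3} back={4}",
            padTouched, padPressed, padPosition, triggerPressed, backPressed);
    }
}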
What's in the Package
There are three scripts provided:
OVRArmModel.cs - Contains ported code from OVR_Skeleton and OVR_ArmModel. This code is responsible for simulating the position of the Gear VR Controller.
OVRTrackedController.cs - Static class exposing the functionality of the Gear VR Controller. This class provides the same output on a PC with Touch controllers as it does on a Gear VR with a Gear VR Controller.
OVRTrackedControllerRig.cs - When attached to an instance of the OVRCameraRig prefab, this script will position the Left or Right hand Tracker in the Unity editor the same way the tracker is positioned on a Gear VR device.
OVRTrackedController.cs contains a static class you can write input code against. When running on a Rift with Touch controllers, this class simulates the position and orientation of a Gear VR Controller. When running on a Gear VR, it simply wraps the Gear VR Controller functionality. Because the OVRTrackedController class works on both platforms, you only need to write input code against this class.
The OVRTrackedController class has an Update method that must be called once every frame. The Update method is responsible for calculating the simulated position of the controller and dispatching callback methods.
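If nothing else in your scene drives the class (see OVRTrackedControllerRig below), a minimal, hypothetical driver script could look like this:

using UnityEngine;

// Hypothetical driver: pumps the static class once per frame so poses and
// callbacks stay current. Not needed if OVRTrackedControllerRig does this for you.
public class TrackedControllerDriver : MonoBehaviour
{
    void Update()
    {
        OVRTrackedController.Update();
    }
}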
This class contains the following public accessors, which can be used to query the state of the current controller (a usage example follows the list):
Quaternion LocalRotation - The local rotation of the controller, relative to the user's body (tracking space).
Vector3 LocalPosition - The local position of the controller, relative to the user's body (tracking space).
bool TriggerDown - True if the trigger is currently pressed.
bool TouchpadDown - True if the touch pad is currently pressed.
bool TouchpadTouched - True if the touch pad is touched, but not pressed (works on the headset as well).
Vector2 TouchpadPosition - The normalized touch position on the touch pad (works on the headset as well).
bool BackClicked - True for only one frame, when the back button is clicked (works on the headset as well).
bool LeftHanded - True if the current controller is left handed.
bool RightHanded - True if the current controller is right handed.
OVRInput.Controller PhysicalController - Enumeration of the physical controller being used. When connected to a Rift this can be LTouch or RTouch. When running on a Gear VR headset this can be LTrackedRemote, RTrackedRemote, or Touchpad.
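For example, a script that polls these accessors might look like the following (the class name is just an illustration; it assumes Update is being called elsewhere, for instance by OVRTrackedControllerRig or a driver like the one above):

using UnityEngine;

// Hypothetical example of polling OVRTrackedController each frame.
public class PointerExample : MonoBehaviour
{
    void Update()
    {
        // Place this object at the simulated controller pose (tracking space).
        transform.localPosition = OVRTrackedController.LocalPosition;
        transform.localRotation = OVRTrackedController.LocalRotation;

        if (OVRTrackedController.TriggerDown)
        {
            Debug.Log("Trigger held on " + OVRTrackedController.PhysicalController);
        }

        if (OVRTrackedController.TouchpadTouched)
        {
            Vector2 touch = OVRTrackedController.TouchpadPosition;
            Debug.Log("Touch pad touched at " + touch);
        }
    }
}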
The OVRTrackedController class also exposes several callback events your code can subscribe to. The handler functions for these events should return nothing and take no arguments, except for the OnTouch event, which takes a Vector2 as an argument (an example follows the list):
OnTriggerDown - Called during the first frame that the trigger is pressed.
OnTouchpadDown - Called during the first frame that the touch pad is pressed.
OnTriggerUp - Called during the first frame that the trigger is released.
OnTouchpadUp - Called during the first frame that the touch pad is released.
OnBackClicked - Called during the frame that the back button was clicked.
OnTouch - Called every frame during which the user's finger is making contact with the touch pad. The handler function should take a Vector2 as an argument, which is the normalized touch position on the touch pad.
The OVRTrackedControllerRig.cs script is meant to be attached to an instance of the OVRCameraRig prefab (included with our Utilities for Unity package). When connected to a Rift, this script will override the transform of the Left or Right hand Anchor in a way that simulates the transform of a Gear VR Controller.
The script has two properties that need to be set in the editor. The first property, Simulate Controller, should be set to L Tracked Remote or R Tracked Remote; the value of this drop-down menu determines which Touch controller is used to simulate a left or right handed Gear VR Controller. The second property, Update Tracked Controller, will call Update on OVRTrackedController if set to true. If you manually update the tracked controller elsewhere (for example, with a driver script like the one sketched earlier), uncheck this box.