Adding Gear VR Controller Support to Unity's UI
Oculus Developer Blog
Posted by Gabor Szauer
August 14, 2017

My last article covered adding Gear VR Controller support to the Unity VR Samples, which avoid using Unity’s UI system. This article explores adding Gear VR Controller support to Unity's UI system. Getting Unity’s UI system to work in VR has already been covered in this blog post. This post expands on that idea, modifying the finished project from the linked blog post to add Gear VR Controller support while maintaining support for the gaze pointer.

Required reading before you proceed: Unity’s UI System in VR


Start by downloading the project from the Unity UI in VR blog post.

This zip file contains a Unity project. Unzip and open with Unity. The project will have some compiler errors. This happens because the project depends on Oculus Utilities for Unity 5. Download the required Unity package from here and import the package into the project. Next, add an Oculus Signature File to the project.

Go to build settings and change the active platform to Android. Under Player Settings make sure Oculus is selected as a supported virtual reality SDK.

The demo should now compile and run on a Gear VR headset. At this point, only the Gaze Pointer is working, as expected.

Adding Gear VR Controller support

Open the VRPointers scene located at Assets/Scenes/VRPointers.unity. This scene uses the OVRCameraRig from the Oculus Utilities package. Add a new instance of the GearVrController prefab located at Assets/OVR/Prefabs/GearVrController.prefab as a child of both the LeftHandAnchor and RightHandAnchor transforms of the camera rig.

The GearVrController prefab has an OVRGearVrController component attached to it. This component shows or hides the game object it is attached to based on the currently active controller. Set the Controller Type of the left- and right-hand prefab instances to L Tracked Remote and R Tracked Remote respectively.
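The show/hide behavior this component provides can be approximated with a minimal sketch. This is illustrative only; the class name and renderer-toggling approach below are assumptions, and the real OVRGearVrController in Oculus Utilities may be implemented differently:

    using UnityEngine;

    // Sketch of a controller-visibility component: each frame, show the attached
    // controller model only while its assigned controller is the active one.
    public class ControllerVisibility : MonoBehaviour
    {
        // Set to L Tracked Remote on the left-hand instance,
        // and R Tracked Remote on the right-hand instance.
        public OVRInput.Controller controllerType;

        void Update()
        {
            bool active = OVRInput.GetActiveController() == controllerType;
            // Toggle child renderers rather than deactivating the whole
            // game object, so this Update keeps running.
            foreach (Renderer r in GetComponentsInChildren<Renderer>())
                r.enabled = active;
        }
    }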

Next, edit the OVRInputModule script located at Assets/Scripts/OVRInputModule.cs. This is the script responsible for providing “mouse events” through gaze. It needs to be edited to provide the same events with a Gear VR Controller when one is present. First, add two new public variables to the script:

    [Header("Gear VR Controller")]
    public Transform trackingSpace;
    public LineRenderer lineRenderer;

The lineRenderer will be used to visualize where the Gear VR Controller is pointing, like a laser pointer. The trackingSpace transform is going to reference the tracking space of the camera rig. This transform is used to create the ray that the Gear VR Controller will be casting.

Find the GetGazePointerData function. The fourth line of code in this function creates a world space ray; it looks like this:

leftData.worldSpaceRay = new Ray (rayTransform.position, rayTransform.forward);

Replace the above line of code with the following:

    OVRInput.Controller controller = OVRInput.GetConnectedControllers () & (OVRInput.Controller.LTrackedRemote | OVRInput.Controller.RTrackedRemote);
    if (lineRenderer != null) {
        lineRenderer.enabled = trackingSpace != null && controller != OVRInput.Controller.None;
    }
    if (trackingSpace != null && controller != OVRInput.Controller.None) {
        controller = ((controller & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None) ? OVRInput.Controller.LTrackedRemote : OVRInput.Controller.RTrackedRemote;

        Quaternion orientation = OVRInput.GetLocalControllerRotation (controller);
        Vector3 localStartPoint = OVRInput.GetLocalControllerPosition (controller);
        Matrix4x4 localToWorld = trackingSpace.localToWorldMatrix;
        Vector3 worldStartPoint = localToWorld.MultiplyPoint (localStartPoint);
        Vector3 worldOrientation = localToWorld.MultiplyVector (orientation * Vector3.forward);
        leftData.worldSpaceRay = new Ray (worldStartPoint, worldOrientation);
        if (lineRenderer != null) {
            lineRenderer.SetPosition (0, worldStartPoint);
            lineRenderer.SetPosition (1, worldStartPoint + worldOrientation * 500.0f);
        }
    } else {
        leftData.worldSpaceRay = new Ray (rayTransform.position, rayTransform.forward);
    }

The first line of code in the above block gets all of the active controllers and checks whether a left- or right-handed Gear VR Controller is among them. The if statement that immediately follows enables or disables the line renderer: for the line renderer to be enabled, the tracking space transform must be set and a Gear VR Controller must be present.

The if statement which follows is responsible for generating and visualizing the ray cast by the Gear VR Controller. First, the controller variable is narrowed to either the left or right remote. Next, the code finds the orientation and position of the controller local to the tracking space of the camera rig. The tracking space transform is then used to find the world space orientation and position of the controller, which are in turn used to create a new ray and to set up the line renderer that visualizes it.

The else statement falls back to creating the input ray from gaze; this code remains unchanged. Save the changes made to this file, then locate the EventSystem game object in the scene; the script that was just modified is attached to this game object. Add a new LineRenderer component to the EventSystem game object. Configure the line renderer by disabling shadows, using the Blue material located at Assets/Materials/Blue.mat, and setting the line width to 0.02.
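If you prefer setting this up from code rather than the inspector, the same configuration can be sketched as follows. The class name is hypothetical, and the material is still assigned in the inspector since Assets/Materials/Blue.mat is not in a Resources folder; the width and shadow settings mirror the values above:

    using UnityEngine;

    // Sketch: configuring the laser-pointer LineRenderer from code.
    // Attach to the EventSystem game object and assign the Blue material
    // (Assets/Materials/Blue.mat) to the material field in the inspector.
    public class LaserSetup : MonoBehaviour
    {
        public Material material;   // the Blue material from the project

        void Awake()
        {
            LineRenderer line = gameObject.AddComponent<LineRenderer>();
            line.material = material;
            line.startWidth = 0.02f;                // matches the width above
            line.endWidth = 0.02f;
            line.positionCount = 2;                 // start and end of the laser
            line.useWorldSpace = true;              // positions are set in world space
            line.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off;
            line.receiveShadows = false;
        }
    }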

Hook up the new public variable references on the OVRInputModule component of the EventSystem game object: use the LineRenderer component that was just added to the EventSystem, and use the TrackingSpace child transform of the OVRCameraRig game object for the tracking space transform.

At this point the project should compile and run on a Gear VR headset. If a Gear VR Controller is connected, a blue laser pointer should be visible. The blue laser will interact with the world space UI using the trigger on the controller. The only seemingly broken part of the experience is the blue gaze pointer ring. It is still attached to the gaze direction. This will be fixed in the next section.

Polishing things up

The first thing to polish is the gaze pointer ring. It should lie somewhere along the ray being cast by the Gear VR Controller. Open the OVRGazePointer script located at Assets/Scripts/OVRGazePointer.cs. Find the Update function in this script. The first line of code forces the gaze pointer to always be in front of the camera. Comment this line of code out:

transform.position = cameraRig.centerEyeAnchor.transform.position + cameraRig.centerEyeAnchor.transform.forward * depth;

That takes care of the gaze pointer. One issue remains: the line renderer extends to its full length and cuts straight through the UI, which can be jarring. Instead, when the line renderer hits a UI component, it should end where the UI was hit. This can be achieved by making further modifications to the OVRInputModule script located at Assets/Scripts/OVRInputModule.cs. Once again, locate the GetGazePointerData function. Inside this function, the gaze pointer is requested to show in two places. Find both instances and add the following bit of code under each one:

    if (lineRenderer != null) {
        lineRenderer.SetPosition (1, raycast.worldPosition);
    }

This should now clamp the line renderer to any world point at which a UI was hit.

General Recap

The process for adding Gear VR Controller support to a gaze-based application more or less follows these five steps:

  1. Locate where the gaze ray is being generated
  2. Confirm that there is a Gear VR Controller connected
    • Use the active controllers bit mask
  3. Create a ray using the Gear VR Controller
    • You will need to know the camera's tracking space for this
  4. Replace the gaze ray with the Gear VR Controller ray
    • Try to keep gaze as a fallback in case no controller is present
  5. Polish up parts of the experience that still rely on the gaze ray
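The five steps above can be condensed into a single sketch. The names mirror the code earlier in the post (trackingSpace is the camera rig's tracking space, rayTransform the gaze transform already used by OVRInputModule); treat it as a pattern, not a drop-in replacement:

    // Condensed sketch of the pattern: detect a remote, build its ray
    // from tracking space, and fall back to gaze when none is present.
    Ray GetPointerRay (Transform trackingSpace, Transform rayTransform)
    {
        // Step 2: check the active-controllers bit mask for a tracked remote.
        OVRInput.Controller remote = OVRInput.GetConnectedControllers ()
            & (OVRInput.Controller.LTrackedRemote | OVRInput.Controller.RTrackedRemote);

        if (trackingSpace != null && remote != OVRInput.Controller.None) {
            // Step 3: build the ray in the camera rig's tracking space...
            remote = ((remote & OVRInput.Controller.LTrackedRemote) != OVRInput.Controller.None)
                ? OVRInput.Controller.LTrackedRemote : OVRInput.Controller.RTrackedRemote;
            Matrix4x4 localToWorld = trackingSpace.localToWorldMatrix;
            Vector3 origin = localToWorld.MultiplyPoint (OVRInput.GetLocalControllerPosition (remote));
            Vector3 direction = localToWorld.MultiplyVector (OVRInput.GetLocalControllerRotation (remote) * Vector3.forward);
            // Step 4: ...and use it in place of the gaze ray.
            return new Ray (origin, direction);
        }

        // Fallback: no controller present, keep using gaze.
        return new Ray (rayTransform.position, rayTransform.forward);
    }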

**All source code in this post is made available under the examples license**