Adding Gear VR Controller Support to the Unity VR Samples
Gabor Szauer
What is the first thing to try if you want to develop a VR game with Unity? The provided VR Samples! These samples work out of the box for the most part. However, they only implement gaze controls, in an effort to target the lowest common denominator.
This post focuses on adding support for the Gear VR Controller to the Unity VR Samples. This is achieved by adding a laser pointer for navigation, and modifying the sample games to work with the Gear VR Controller instead of gaze controls.
This post assumes you are already familiar with Unity as well as C#. The post is written using Unity 5.6.2f1. Some familiarity with the Oculus Unity SDK is also assumed, as this post will not be discussing the SDK API in detail. Before reading further, you should take a few minutes to familiarize yourself with the Unity VR Sample project by reading through this article.
* * *
SETUP
Start a new project by getting the VR Samples from the Unity Asset Store.
Download and import the Oculus Utilities for Unity 5 package into the project. As you import this plugin, you may be prompted to update to the most recent version. If prompted, go ahead and update. If you have previously installed a Utilities version in your project, be sure to delete the old assets specified here.
Change the build target to Android. You can remove the VRSampleScenes/Scenes/Intro scene from the build.
* * *
COMPILE
Importing the VR Samples package should modify the project's build settings to support Oculus. Confirm this by making sure Virtual Reality Supported is enabled in Project Settings, and the Oculus platform is supported in Build Settings.
You will need an Oculus Signature File (OSIG) to develop content for Gear VR. Instructions for generating a signature file are available on the OSIG Generator page. Once you have your OSIG file, place it in the Assets/Plugins/Android/assets folder of the Unity project. If this folder does not exist, create it.
At this point you should be able to build and run the VR Samples project on a Gear VR headset. Just make an Android build and install it on a supported phone.
* * *
PERMISSIONS
The first issue that comes up when running the sample on a device is the permission dialogs:
There are two toast pop-ups asking for permission to manage photos and make phone calls. Disable these pop-ups by creating an Android manifest file which skips the permission dialogs:
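One way to do this is to add a custom AndroidManifest.xml under Assets/Plugins/Android that sets Unity's unityplayer.SkipPermissionsDialog meta-data. The sketch below is a minimal example; the package name is a placeholder, and Unity merges this file with its generated manifest, so the exact manifest used in the original setup may contain more than this:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android" package="com.yourcompany.yourapp">
  <application>
    <!-- Skip the runtime permission dialogs shown when the application starts -->
    <meta-data android:name="unityplayer.SkipPermissionsDialog" android:value="true" />
  </application>
</manifest>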
If you run the project now, there should no longer be any visible permission dialogs.
* * *
MAIN MENU
The Oculus Utilities for Unity 5 package that was imported into your project contains several scripts and prefabs which will help us speed up development. One of these prefabs is OVRCameraRig, which can be used as a drop-in camera for any scene. Before moving on, it's a good idea to get familiar with how the camera rig works.
The hierarchy of this prefab is as follows:
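OVRCameraRig
  TrackingSpace
    LeftEyeAnchor
    CenterEyeAnchor
    RightEyeAnchor
    TrackerAnchor
    LeftHandAnchor
    RightHandAnchor

(The exact children can vary slightly between Utilities versions; the anchors shown are the ones referenced in this post, plus TrackerAnchor.)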
What each anchor object represents is self-explanatory. There are cameras attached to the LeftEyeAnchor, RightEyeAnchor and CenterEyeAnchor objects. The CenterEyeAnchor camera is used to render in the Editor.
The OVRCameraRig object has two attached components, OVRCameraRig and OVRManager. The OVRCameraRig script positions all of the anchor objects, based on the name of each object. The OVRManager script provides an easy way to configure VR parameters, and much more.
In VR, you should never move the camera directly! Instead, move the root object, OVRCameraRig. You will also notice the presence of a TrackingSpace object. TrackingSpace is the transform that the eye and hand anchors are local to. A TrackingSpace object should always be the parent of your VR camera, and the camera should be placed at the origin of the tracking space.
Having a TrackingSpace transform is important even if you don’t use the OVRCameraRig prefab. The OVR API reports controller position and rotation in “local space”. Local space in this context is local to the Tracking space.
* * *
SETTING UP THE RIG
Let’s start by adding controller support to the main menu. Open the menu scene located at Assets/VRSampleScenes/Scenes/MainMenu.unity.
Create a new instance of the OVRCameraRig prefab, located at Assets/OVR/Prefabs/OVRCameraRig.prefab. Set the transform of the camera rig to the origin, then set the Y position of the camera rig to 2. This matches the transform of the default camera already in the scene.
Rename the MainCamera object to CenterEyeAnchor.
Parent the renamed CenterEyeAnchor (formerly MainCamera) game object, currently at the root of the scene, under the OVRCameraRig object.
Remove the original CenterEyeAnchor game object from the OVRCameraRig prefab (this will break the prefab instance, that’s ok).
Parent an instance of the GearVrController prefab, located at OVR/Prefabs/GearVrController.prefab, under each of the LeftHandAnchor and RightHandAnchor game objects.
The GearVrController prefab contains a 3D model of the Gear VR Controller and has an OVRGearVrController script attached. This script shows or hides a game object based on which controller is attached. To configure it properly, set the controller type of the right-hand instance to R Tracked Remote and the left-hand instance to L Tracked Remote.
* * *
ADDING A LASER
Now that there is a camera rig in the Main Menu scene, it’s time to add a “laser pointer”. Gaze controls will still work if the player does not have a Gear VR Controller paired.
Add a new LineRenderer component to CenterEyeAnchor. Configure the line renderer like so:
Turn off shadows
Use the MazeAgentPath material
Located at: VRSampleScenes/Materials/Maze/MazeAgentPath.mat
A width of 0.02
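If you prefer to do this from code instead of the Inspector, a rough equivalent of the settings above might look like the following hypothetical helper component (not part of the samples; the material would still be assigned in the Inspector):

using UnityEngine;

// Hypothetical helper that applies the laser LineRenderer settings from code.
[RequireComponent (typeof (LineRenderer))]
public class LaserLineSetup : MonoBehaviour {
    [SerializeField] private Material m_LaserMaterial = null; // Assign MazeAgentPath.mat here

    private void Awake () {
        LineRenderer line = GetComponent<LineRenderer> ();
        line.shadowCastingMode = UnityEngine.Rendering.ShadowCastingMode.Off; // Turn off shadows
        line.receiveShadows = false;
        line.material = m_LaserMaterial;  // Use the MazeAgentPath material
        line.startWidth = 0.02f;          // A width of 0.02
        line.endWidth = 0.02f;
    }
}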
* * *
CODING THE LASER POINTER
Next, let's edit the VREyeRaycaster.cs script to support a laser pointer as well as gaze. This script is located at VRStandardAssets/Scripts/VREyeRaycaster.cs. First, add three new fields to this script.
The first field will be a reference to a line renderer component used to display the laser pointer.
The second field will control the visibility of the line renderer.
The third field is a reference to the tracking space of the camera rig. The laser pointer is relative to this space.
[SerializeField] private LineRenderer m_LineRenderer = null; // For supporting Laser Pointer
public bool ShowLineRenderer = true; // Laser pointer visibility
[SerializeField] private Transform m_TrackingSpace = null; // Tracking space (for line renderer)
Next, add two new accessors to this script. The first accessor will query whether or not a controller is connected. The second accessor will return the controller that is connected (if any). These accessors check all connected controllers, in case a Gear VR Controller is connected but is not the active controller.
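A sketch of what these accessors can look like:

public bool ControllerIsConnected {
    get {
        // Check only the Gear VR remotes, regardless of which controller is currently active
        OVRInput.Controller remotes = OVRInput.Controller.LTrackedRemote | OVRInput.Controller.RTrackedRemote;
        return (OVRInput.GetConnectedControllers () & remotes) != OVRInput.Controller.None;
    }
}

public OVRInput.Controller Controller {
    get {
        OVRInput.Controller connected = OVRInput.GetConnectedControllers ();
        if ((connected & OVRInput.Controller.LTrackedRemote) == OVRInput.Controller.LTrackedRemote) {
            return OVRInput.Controller.LTrackedRemote;
        }
        if ((connected & OVRInput.Controller.RTrackedRemote) == OVRInput.Controller.RTrackedRemote) {
            return OVRInput.Controller.RTrackedRemote;
        }
        // No remote connected: fall back to whatever controller OVRInput considers active
        return OVRInput.GetActiveController ();
    }
}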
The above code uses OVRInput to determine if a controller is connected or not. The GetConnectedControllers function returns a bit mask that can be checked against the OVRInput.Controller enumeration.
Still editing VREyeRaycaster.cs, find the EyeRaycast function. Find where the ray and raycast hit are created, and add the new code after those lines.
In this spot, create two Vector3 variables which represent the start and end of the laser, and set the enabled state of the line renderer based on whether or not a controller is present.
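A sketch of this step (worldStartPoint and worldEndPoint are the names used by the rest of the snippets in this section):

Vector3 worldStartPoint = Vector3.zero;
Vector3 worldEndPoint = Vector3.zero;

if (m_LineRenderer != null) {
    // Only show the laser when a Gear VR Controller is connected and the laser is not explicitly hidden
    m_LineRenderer.enabled = ControllerIsConnected && ShowLineRenderer;
}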
Right below that, write the logic for the laser pointer: if a controller is connected, create a new ray.
if (ControllerIsConnected && m_TrackingSpace != null) {
    Matrix4x4 localToWorld = m_TrackingSpace.localToWorldMatrix;
    Quaternion orientation = OVRInput.GetLocalControllerRotation (Controller);

    Vector3 localStartPoint = OVRInput.GetLocalControllerPosition (Controller);
    Vector3 localEndPoint = localStartPoint + ((orientation * Vector3.forward) * 500.0f);

    worldStartPoint = localToWorld.MultiplyPoint (localStartPoint);
    worldEndPoint = localToWorld.MultiplyPoint (localEndPoint);

    // Create new ray
    ray = new Ray (worldStartPoint, worldEndPoint - worldStartPoint);
}
In the above code, OVRInput.GetLocalControllerRotation and OVRInput.GetLocalControllerPosition get the orientation and position of the Gear VR Controller in the same space as the camera (relative to the tracking space). The orientation and position are then converted to world space using the transform matrix of the tracking space.
If the ray has hit something, the end point of the laser needs to be adjusted for that. Find the if statement that handles the raycast; if a VRInteractiveItem was hit, set the world end point of the laser to the hit point of the ray:
if (Physics.Raycast (ray, out hit, m_RayLength, ~m_ExclusionLayers)) {
    VRInteractiveItem interactible = hit.collider.GetComponent<VRInteractiveItem> (); // Attempt to get the VRInteractiveItem on the hit object
    // Above remains unchanged

    if (interactible) {
        worldEndPoint = hit.point;
    }

    // Below remains unchanged
    m_CurrentInteractible = interactible;
Finally, at the end of the EyeRaycast function, after the if/else statement that handles the actual raycast set the start and end points of the line renderer to be the same as the start and end points of the ray being cast.
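A sketch of this final step:

if (ControllerIsConnected && m_LineRenderer != null) {
    // The laser spans the same segment as the ray that was just cast
    m_LineRenderer.SetPosition (0, worldStartPoint);
    m_LineRenderer.SetPosition (1, worldEndPoint);
}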
The way that the reticle works also needs to be changed. By default, if the gaze / laser doesn’t hit anything, the reticle is positioned where the player is looking. If a Gear VR Controller is paired, the reticle should be positioned where the laser ends.
Open Reticle.cs, located at Assets/VRSampleScenes/Scripts/Utils/Reticle.cs, and locate the SetPosition function which takes no arguments. Add two arguments to this function: the first argument will be a Vector3 which represents a point in space, and the second will be a Vector3 which represents a normalized direction.
Change the SetPosition function from:
// This overload of SetPosition is used when the VREyeRaycaster hasn't hit anything.
public void SetPosition () {
    // Set the position of the reticle to the default distance in front of the camera.
    m_ReticleTransform.position = m_Camera.position + m_Camera.forward * m_DefaultDistance;

    // Set the scale based on the original and the distance from the camera.
    m_ReticleTransform.localScale = m_OriginalScale * m_DefaultDistance;

    // The rotation should just be the default.
    m_ReticleTransform.localRotation = m_OriginalRotation;
}
into
public void SetPosition (Vector3 position, Vector3 forward) {
    // Set the position of the reticle to the default distance in front of the camera.
    m_ReticleTransform.position = position + forward * m_DefaultDistance;

    // Set the scale based on the original and the distance from the camera.
    m_ReticleTransform.localScale = m_OriginalScale * m_DefaultDistance;

    // The rotation should just be the default.
    m_ReticleTransform.localRotation = m_OriginalRotation;
}
At this point you should get a compiler error. To fix this error, you will need to edit VREyeRaycaster.cs located at VRStandardAssets/Scripts/VREyeRaycaster.cs again. Find where the SetPosition function of the reticle is being called with no arguments (near the end of the EyeRaycast function) and change the call from:
if (m_Reticle)
    m_Reticle.SetPosition();
to:
if (m_Reticle)
    m_Reticle.SetPosition (ray.origin, ray.direction);
* * *
HOOKING THINGS UP
In the MainMenu scene, find the CenterEyeAnchor object located in the hierarchy of the OVRCameraRig. Because this object used to be the default camera, it should already have a VREyeRaycaster component attached. Hook up the new LineRenderer component (which is on the same game object) and the TrackingSpace transform to the corresponding fields of the VREyeRaycaster.
You can now apply the changes made to this game object (now named CenterEyeAnchor) back to its prefab, the main camera prefab located at VRSampleScenes/Prefabs/Utils/MainCamera.prefab. This ensures that any scene which uses this camera prefab will have a laser pointer.
The main menu now has support for both gaze selection and laser pointer selection! You should be able to build and run the project and test out the menu's new control scheme. When hovering over a menu item, use the trigger on the controller to enter any game.
* * *
FLYER
Now that the main menu has laser support, let's modify the Flyer demo to work with the controller as well. A laser pointer does not make sense for this demo, but controlling the spaceship with the controller instead of head tilting will make the game more comfortable to play.
This demo will approach integrating Gear VR Controller support differently. None of the prefabs from the Oculus framework are going to be used.
First, open the Flyer scene located at VRSampleScenes/Scenes/Flyer.unity. The movement of the ship in the Flyer demo is controlled by FlyerMovementController.cs, located at VRSampleScenes/Scripts/Flyer/FlyerMovementController.cs. This script sets the position of the ship based on the orientation of the user's head. It needs to be changed to work based on the orientation of the Gear VR Controller.
In FlyerMovementController.cs, create a new accessor for orientation. If a controller is connected, this accessor should return the local rotation of the controller. If no controller is connected, fall back to gaze controls by returning the orientation of the head (this is the default behavior).
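A sketch of what this accessor could look like (the fallback uses UnityEngine.VR.InputTracking to read the head rotation, which is what gaze control is based on):

private Quaternion Orientation {
    get {
        OVRInput.Controller connected = OVRInput.GetConnectedControllers ();
        if ((connected & OVRInput.Controller.LTrackedRemote) == OVRInput.Controller.LTrackedRemote) {
            return OVRInput.GetLocalControllerRotation (OVRInput.Controller.LTrackedRemote);
        }
        if ((connected & OVRInput.Controller.RTrackedRemote) == OVRInput.Controller.RTrackedRemote) {
            return OVRInput.GetLocalControllerRotation (OVRInput.Controller.RTrackedRemote);
        }
        // No controller connected: fall back to the head (gaze) orientation
        return UnityEngine.VR.InputTracking.GetLocalRotation (UnityEngine.VR.VRNode.Head);
    }
}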
If you applied the camera prefab in the last section, the MainCamera located in the scene at Cameras/FlyerVRCameraContainer/MainCamera should already have a LineRenderer component attached. Set the tracking space transform to the parent of the camera, FlyerVRCameraContainer.
As you can see in the above screenshot, the Y position of the main camera is 2. Remember, the main camera should be positioned at the origin of its tracking space! Fix this by changing the Y position of FlyerVRCameraContainer to 2, Particles to -2, and MainCamera to 0.
The LineRenderer is currently visible at all times, but it should only be visible while the game is not being played. Let's fix this by editing FlyerGameController.cs, located at VRSampleScenes/Scripts/Flyer/FlyerGameController.cs. First, add a new reference to a VREyeRaycaster object.
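For example:

[SerializeField] private VREyeRaycaster m_EyeRaycaster = null;    // Used to toggle the laser pointer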
In the PlayPhase coroutine, hide the line renderer of the VREyeRaycaster when the play phase starts and show it again when the phase ends.
private IEnumerator PlayPhase () {
    if (m_EyeRaycaster != null) {
        m_EyeRaycaster.ShowLineRenderer = false;
    }

    // Rest of the function (here) is unchanged

    if (m_EyeRaycaster != null) {
        m_EyeRaycaster.ShowLineRenderer = true;
    }
}
Finally, hook up the raycaster component reference in the editor. The FlyerGameController component is attached to the GameController game object in the scene, located at: System/GameController. It needs to reference the VREyeRaycaster component of the main camera.
There is one more thing you need to do: update the input manager. In the Main Menu scene, the OVRCameraRig had an OVRManager component, which updated the input manager for you. Now that the OVRCameraRig is no longer used and there is no OVRManager in the scene, the input manager needs to be updated manually.
The best place to update the input manager is VRDeviceManager.cs, located at VRStandardAssets/Scripts/VRDeviceManager.cs; this component is attached to a persistent game object. Add the following code to VRDeviceManager to ensure that the input manager is always updated:
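A minimal sketch, assuming VRDeviceManager does not already define these methods (if it does, add the calls to the existing methods instead):

private void Update () {
    OVRInput.Update ();      // Normally called by OVRManager every frame
}

private void FixedUpdate () {
    OVRInput.FixedUpdate (); // Normally called by OVRManager every fixed update
}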
You should now be able to compile and play the Flyer demo using the Gear VR Controller.
* * *
MAZE
Let’s get the Maze demo working next! Open the Maze scene located at VRSampleScenes/Scenes/Maze.unity.
Find the main camera of the scene, located at Maze/Cameras/VROrbitCamera/MainCamera. The prefab connection between this camera and the one that was edited in the MainMenu scene is broken. Hit the Revert button on the prefab to fix the connection. After hitting Revert, the camera should have a line renderer attached.
Next, you need to set the tracking space of the camera. The camera should always be at the origin of the tracking space, which this scene is not set up for. Make a new game object under Cameras/VROrbitCamera and name it TrackingSpace. This new game object should be a sibling of the MainCamera object. Give it the same transform as MainCamera (Position at x: 0, y: 9.45, z: -16.23).
Next, you want to re-parent the main camera so it is a child of the new tracking space. Once you have done this, the transform of the main camera should automatically be at the origin of the tracking space.
Finally, on the MainCamera game object, the VREyeRaycaster script has a “TrackingSpace” field. Set this to the transform of the TrackingSpace game object.
At this point the Maze demo should be playable using a Gear VR Controller.
* * *
TARGET GALLERY
Next, let's get the Target Gallery working with the Gear VR Controller. Open the Shooter180 scene located at Assets/VRSampleScenes/Scenes/Shooter180.unity. First, you have to make a tracking space for the main camera. The default camera has a Y position of 1.5; the tracking space must match this.
Don't forget to hook up the tracking space transform of the VREyeRaycaster component attached to the MainCamera.
Similar to the Flyer demo, the laser pointer should only be visible when the game is not being played. To do this, edit ShootingGalleryController.cs, located at VRSampleScenes/Scripts/ShootingGallery/ShootingGalleryController.cs. Add a new VREyeRaycaster field to the script.
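As before, something like:

[SerializeField] private VREyeRaycaster m_EyeRaycaster = null;    // Used to toggle the laser pointer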
Find the PlayPhase coroutine. At the start of the coroutine, use the new raycaster field to hide the line renderer; at the end of the coroutine, show it again.
private IEnumerator PlayPhase () {
    if (m_EyeRaycaster != null) {
        m_EyeRaycaster.ShowLineRenderer = false;
    }

    // The body of PlayPhase here remains unchanged

    if (m_EyeRaycaster != null) {
        m_EyeRaycaster.ShowLineRenderer = true;
    }
}
The ShootingGalleryController component is attached to the game object at the following path: System/ShootingGalleryController. You need to assign its VR Eye Raycaster field.
The gun should be aligned to the laser pointer, not the gaze of the player. Orienting the gun is similar to orienting the ship in the Flyer demo. Edit the ShootingGalleryGun.cs script located at VRSampleScenes/Scripts/ShootingGallery/ShootingGalleryGun.cs and add an orientation accessor, similar to the one used for the Flyer demo.
The target gallery sample should now work with full support for the Gear VR Controller.
* * *
TARGET ARENA
The Target Arena demo is going to require minimal changes, as it uses the same scripts as the Target Gallery demo. To begin, open the Shooter360 scene located at Assets/VRSampleScenes/Scenes/Shooter360.unity.
Create a tracking space for the main camera the same way it was done for the Target Gallery. This scene uses the ShootingGalleryController.cs script which was modified in the last section; it is attached to the game object located at System/ShootingGalleryController in the scene. You need to hook up its Eye Raycaster field to the main camera's VREyeRaycaster component.
The gun should already work as it uses the ShootingGalleryGun.cs script, which is the same script changed in the Target Gallery demo.
The Target Arena demo should now work with the Gear VR Controller.
* * *
BACK BUTTON
If you have been running each scene as it is being built, you may have noticed the back button on the controller does not behave as expected. Edit VRInput.cs located at VRStandardAssets/Scripts/VRInput.cs to recognize the controllers back button.
The following bit of code, near line 112, handles the cancel button:
if (Input.GetButtonDown("Cancel")) {
if (OnCancel != null)
OnCancel();
}
Change the if statement to check if the back button is being pressed on either controller:
if (Input.GetButtonDown("Cancel") || OVRInput.GetDown (OVRInput.RawButton.Back, OVRInput.Controller.LTrackedRemote) || OVRInput.GetDown (OVRInput.RawButton.Back, OVRInput.Controller.RTrackedRemote)) {
if (OnCancel !=null)
OnCancel();
}