Add Camera Rig Using OVRCameraRig

The Oculus Integration SDK contains the OVRCameraRig prefab, which provides the transform object that represents the Oculus tracking space. It contains a tracking space game object that lets you fine-tune the relationship between the head tracking reference frame and your world. Under the tracking space object, you will find a center eye anchor, which is the main Unity camera, an anchor game object for each eye, and left and right hand anchors for controllers. The prefab also contains a custom VR camera, which replaces Unity’s conventional camera.

How Does This Work?

When you enable VR support in Unity, your headset automatically passes head and positional tracking data to Unity. This lets the camera’s position and orientation closely match the user’s position and orientation in the real world. The head-tracked pose values override the camera’s transform values, which means the camera is always positioned relative to the player object.

In a typical first- or third-person setup, instead of a stationary camera, you may want the camera to follow or track the player object. The player object can be a character in motion, such as an avatar, a car, or a gun turret. To make the camera follow the player object, you can either make the camera a child of the player object or have another object track the player, with the camera following that object. Depending on your app design, you may want to create a script that references the player object and attach that script to OVRCameraRig.
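As a sketch of this approach, a minimal follow script might look like the following. The class name, the player field, and the offset are assumptions for illustration; assign the player transform in the Inspector.

```csharp
using UnityEngine;

// Hypothetical example: attach this to the OVRCameraRig game object so the rig
// follows a player object. Head tracking is applied on top of the rig's transform.
public class RigFollowPlayer : MonoBehaviour
{
    public Transform player;              // the avatar, vehicle, etc. to follow
    public Vector3 offset = Vector3.zero; // optional positional offset from the player

    void LateUpdate()
    {
        // LateUpdate runs after the player has moved this frame,
        // so the rig never lags one frame behind the player.
        transform.position = player.position + offset;
        transform.rotation = player.rotation;
    }
}
```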

Add OVRCameraRig in the Scene

OVRCameraRig is a replacement for Unity’s main camera, which means you can safely delete Unity’s main camera from the Hierarchy tab. The primary benefit of using OVRCameraRig is that it provides access to OVRManager, which is the main interface to the VR hardware. Before you add OVRCameraRig, make sure you have downloaded the Oculus Integration SDK and enabled VR support.

  1. From the Hierarchy tab, right-click Main Camera, and click Delete.
  2. In the Project tab, expand the Assets > Oculus > VR > Prefab folder, and drag and drop the OVRCameraRig prefab into the scene. You can also drag and drop it in the Hierarchy tab.

Configure Settings

There are two main scripts attached to the OVRCameraRig prefab: OVRCameraRig.cs and OVRManager.cs. Together, these scripts provide settings for the camera, display, tracking, quality, and performance of your app.

To begin, in the Hierarchy tab, select OVRCameraRig, and then in the Inspector tab, review the following settings:

OVRCameraRig Settings

OVRCameraRig.cs is a component that controls stereo rendering and head tracking. It maintains three child anchor transforms at the poses of the left and right eyes, as well as a virtual center eye that is halfway between them. It is the main interface between Unity and the cameras, and is attached to a prefab that makes it easy to add comfortable VR support to a scene.

Note: All camera control should be done through this component.

  • Use Per Eye Camera: Select this option to use separate cameras for left and right eyes.
  • Use Fixed Update For Tracking: Select this option to update all the tracked anchors in the FixedUpdate() method instead of the Update() method to favor physics fidelity. However, if the fixed update rate doesn’t match the rendering frame rate, which is reported by OVRManager.display.appFramerate, the anchors may visibly judder.
  • Disable Eye Anchor Cameras: Select this option to disable the cameras on the eye anchors. In this case, the main camera of the game is used to provide the VR rendering and the tracking space anchors are updated to provide reference poses.

OVRManager Settings

OVRManager.cs is the main interface to the VR hardware and is added to the OVRCameraRig prefab. It is a singleton that exposes the Oculus SDK to Unity, and includes helper functions that use the stored Oculus variables to help configure the camera behavior. It can be a part of any app object and should only be declared once.

Target Devices

All apps that target Oculus Quest automatically run on Oculus Quest 2. However, for best compatibility, when you query the headset type that the app is running on, Oculus returns Oculus Quest even if the headset is an Oculus Quest 2. To identify the headset type precisely, select both Oculus Quest and Oculus Quest 2 as target devices. In this case, when you query the headset type, Oculus returns the exact headset that the app is running on. Based on the target headsets, Oculus automatically adds the <meta-data android:name="com.oculus.supportedDevices" android:value="quest" /> element for Oculus Quest, or the <meta-data android:name="com.oculus.supportedDevices" android:value="quest|quest2" /> element for both Oculus Quest and Oculus Quest 2, to the Android Manifest file. There is no need to update the Android Manifest file manually.

When your app targets both headsets, check the headset type at runtime to optimize the app. Call OVRManager.systemHeadsetType to return the headset type that the app is running on. For example, it returns Oculus_Quest or Oculus_Quest_2 depending on the headset type.
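A runtime check might look like the following sketch, assuming OVRManager.systemHeadsetType exposes an OVRPlugin.SystemHeadset value as in recent Oculus Integration versions; the branch body is illustrative.

```csharp
using UnityEngine;

// Sketch: branch on the headset type at startup to tune quality settings.
public class HeadsetTypeCheck : MonoBehaviour
{
    void Start()
    {
        // Reports the exact device only when Quest 2 is also listed as a target device.
        OVRPlugin.SystemHeadset headset = OVRManager.systemHeadsetType;

        if (headset == OVRPlugin.SystemHeadset.Oculus_Quest_2)
        {
            // e.g., enable higher-cost effects or a larger render scale on Quest 2.
        }
    }
}
```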

Performance and Quality

  • Use Recommended MSAA Level: True, by default. Select this option to let OVRManager automatically choose the appropriate MSAA level based on the Oculus device. For example, the MSAA level is set to 4x for Oculus Quest. Currently supported only for Unity’s built-in render pipeline.

    Note: For the Universal Render Pipeline (URP), manually set the MSAA level to 4x. We are aware that URP does not set the MSAA level automatically, and we will announce a fix on the Release Notes page when it is available.

  • Monoscopic: If true, both eyes see the same image rendered from the center eye pose, saving performance on low-end devices. We do not recommend using this setting as it doesn’t provide the correct experience in VR.

  • Enable Adaptive Resolution: Select this option to scale the app’s resolution down as GPU utilization exceeds 85% and to scale it back up as utilization falls below 85% (range 0.5–2.0; 1 = normal density). To minimize perceived artifacts from changing resolution, there is a two-second minimum delay between resolution changes.
  • Min Render Scale: Sets minimum bound for Adaptive Resolution (default value is 0.7).
  • Max Render Scale (Rift only): Sets maximum bound for Adaptive Resolution (default value is 1.0).
  • Head Pose Relative Offset Rotation: Sets the relative offset rotation of head poses.
  • Head Pose Relative Offset Translation: Sets the relative offset translation of head poses.
  • Profiler TCP Port: The TCP listening port of Oculus Profiler Service, which is activated in debug or development builds. When the app is running on editor or device, go to Tools > Oculus > Oculus Profiler Panel to view the real-time system metrics.
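The adaptive resolution settings above can also be set from script. A minimal sketch, assuming the public OVRManager members enableAdaptiveResolution, minRenderScale, and maxRenderScale from the Oculus Integration:

```csharp
// Sketch: configure adaptive resolution at runtime instead of in the Inspector.
void ConfigureAdaptiveResolution()
{
    OVRManager manager = OVRManager.instance;
    manager.enableAdaptiveResolution = true; // scale resolution with GPU load
    manager.minRenderScale = 0.7f;           // lower bound (default noted above)
    manager.maxRenderScale = 1.0f;           // upper bound (Rift only)
}
```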


Tracking

  • Tracking Origin Type: Sets the tracking origin type.

    Eye Level tracks the position and orientation relative to the device’s position.

    Floor Level tracks the position and orientation relative to the floor, based on the user’s standing height as specified in the Oculus Configuration Utility.

  • Use Positional Tracking: When enabled, head tracking affects the position of the virtual cameras.
  • Use IPD in Positional Tracking: When enabled, the distance between the user’s eyes affects the position of each OVRCameraRig’s cameras.
  • Reset Tracker on Load: When enabled, each scene load causes the head pose to reset. When disabled, subsequent scene loads do not reset the tracker, which keeps the tracker orientation the same from scene to scene and keeps magnetometer settings intact.
  • Allow Recenter: Select this option to reset the pose when the user clicks the Reset View option from the universal menu. You should select this option for apps with a stationary position in the virtual world and allow the Reset View option to place the user back to a predefined location (such as a cockpit seat). Do not select this option if you have a locomotion system because resetting the view effectively teleports the user to potentially invalid locations.

For Oculus Rift, OVRManager.display.RecenterPose() recenters the head pose and the tracked controller pose, if present. For more information about tracking controllers, see Map Controllers.

If Tracking Origin Type is set to Floor Level, OVRManager.display.RecenterPose() resets the x-, y-, and z-axis positions to origin. If it is set to Eye Level, the x-, y-, and z-axis positions are also reset to origin, with the y-value corresponding to the height calibration performed with the Oculus Configuration Utility. In both cases, the y rotation is reset to 0, but the x and z rotations are unchanged to maintain a consistent ground plane.
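As an example, a sketch that recenters the pose when the user presses a controller button; the button binding here is an arbitrary choice for illustration.

```csharp
using UnityEngine;

// Sketch: recenter the head (and tracked controller) pose on a button press.
public class RecenterOnButton : MonoBehaviour
{
    void Update()
    {
        // Button.Two is an assumed binding; pick whatever fits your input scheme.
        if (OVRInput.GetDown(OVRInput.Button.Two))
        {
            OVRManager.display.RecenterPose();
        }
    }
}
```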

  • Late Controller Update: Select this option to update the pose of the controllers immediately before rendering for lower latency between real-world and virtual controller movement. If controller poses are used for simulation/physics, the position may be slightly behind the position used for rendering (~10ms). Any calculations done at simulation time may not exactly match the controller’s rendered position.


Color Gamut

Remaster your app by setting a specific color space at runtime to overcome the color variation that may occur when different color spaces are in use.

  • From the Color Gamut list, select the specific color space. For more information about the available color gamut primaries, go to the Set Specific Color Space topic.

Quest Features

The following settings apply to Oculus Quest only.

  • Focus Aware: Select this option to allow users to access system UI without context switching away from the app. For more information about enabling focus awareness, go to the Enable Focus Awareness for System Overlays topic.
  • Hand Tracking Support: From the list, select the type of input affordance for your app. For more information about setting up hand tracking, go to the Set Up Hand Tracking topic.
  • Hand Tracking Frequency: From the list, select the hand tracking frequency. A higher frequency allows for better gesture detection and lower latencies but reserves some performance headroom from the application’s budget. For more information, go to the Set High Frequency Hand Tracking section.
  • Requires System Keyboard: Select this option to allow users to interact with a system keyboard. For more information, go to the Enable Keyboard Overlay in Unity topic.
  • System Splash Screen: Click Select to open a list of 2D textures and select the image you want to set as the splash screen.
  • Allow Optional 3DoF Head Tracking: Select this option to support 3DoF in addition to 6DoF and let the app run without head tracking, for example, in low-light conditions. When your app supports 3DoF, Oculus automatically sets the headtracking value to false in the Android Manifest. When the checkbox is not selected, in other words when the app supports only 6DoF, the headtracking value is set to true.

Android Build Settings

The shader stripping feature lets you exclude unused shaders from compilation, which significantly reduces the player build time. Select Skip Unneeded Shaders to enable shader stripping. For more information about the different tiers and stripping shaders, go to the Strip Unused Shaders topic.


  • Custom Security XML Path: If you don’t want Oculus to generate a security XML and instead want to use your own XML, specify the XML file path.
  • Disable Backups: Select this option to ensure private user information is not inadvertently exposed to unauthorized parties or insecure locations. It adds the allowBackup="false" flag in the AndroidManifest.xml file.
  • Enable NSC Configuration: Select this option to prevent the app or any embedded SDK from initiating cleartext HTTP connections and force the app to use HTTPS encryption.

Mixed Reality Capture

Mixed Reality Capture (MRC) places real-world objects in VR. In other words, it combines images from the real world with the virtual one. To enable the mixed reality support, select Show Properties, and then select enableMixedReality. For more information about setting up mixed reality capture, go to the Unity Mixed Reality Capture guide.