The Oculus Integration package contains a prefab, OVRCameraRig, which provides the transform object to represent the Oculus tracking space. Under the tracking space, there is a custom VR camera, which replaces Unity’s conventional camera. OVRCameraRig also provides access to OVRManager, which is the main interface to the VR hardware.
The OVRCameraRig prefab contains a tracking space game object to fine-tune the relationship between the head tracking reference frame and your world. Under the tracking space object, you will find a center eye anchor, which is the main Unity camera, an anchor game object for each eye, and left and right hand anchors for controllers.
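If you need these anchors from a script at runtime, you can read them from the OVRCameraRig component instead of searching the hierarchy by name. The following is a minimal sketch, assuming the script is attached to the OVRCameraRig game object; the CameraRigProbe class name and the logging are illustrative only.

using UnityEngine;

// Illustrative helper: reads the anchor transforms exposed by OVRCameraRig.
// Attach this to the OVRCameraRig game object in your scene.
public class CameraRigProbe : MonoBehaviour
{
    private OVRCameraRig rig;

    void Awake()
    {
        rig = GetComponent<OVRCameraRig>();
    }

    void Update()
    {
        // centerEyeAnchor holds the main Unity camera; the hand anchors
        // follow the tracked controllers.
        Vector3 head = rig.centerEyeAnchor.position;
        Vector3 leftHand = rig.leftHandAnchor.position;
        Vector3 rightHand = rig.rightHandAnchor.position;

        Debug.Log($"Head: {head}  Left hand: {leftHand}  Right hand: {rightHand}");
    }
}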
When you enable VR support in Unity, your head-mounted device automatically passes head and positional tracking data to Unity. This lets the camera’s position and orientation closely match the user’s position and orientation in the real world. The head-tracked pose values override the camera’s transform values, which means the camera is always positioned relative to the player object.
In a typical first- or third-person setup, instead of having a stationary camera, you may want the camera to follow or track the player object. The player object can be a character in motion, such as an avatar, a car, or a gun turret. To make the camera follow the player object, you can either make the camera a child of the player object, or have another object track the player and have the camera follow that object. Depending on your app design, you may want to create a script that references the player object and attach the script to OVRCameraRig.
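A minimal follow script might look like the sketch below; the FollowPlayer class name, the target and offset fields, and the followSpeed value are illustrative, not part of the SDK. Attach the script to OVRCameraRig and assign your player object in the Inspector.

using UnityEngine;

// Illustrative sketch: keeps the camera rig positioned relative to a moving
// player object. Attach to OVRCameraRig and assign the player transform.
public class FollowPlayer : MonoBehaviour
{
    public Transform target;              // the player object to follow
    public Vector3 offset = Vector3.zero; // optional offset from the player
    public float followSpeed = 10f;       // higher values track more tightly

    void LateUpdate()
    {
        if (target == null)
            return;

        // Move the rig's root transform only; the head-tracked pose is applied
        // to the anchors under the tracking space on top of this transform.
        Vector3 desired = target.position + offset;
        transform.position = Vector3.Lerp(transform.position, desired,
                                          followSpeed * Time.deltaTime);
    }
}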
OVRCameraRig is a replacement for Unity’s main camera, which means you can safely delete Unity’s main camera from the Hierarchy view. The primary benefit of using OVRCameraRig is that it provides access to OVRManager, which is the main interface to the VR hardware. Before you add OVRCameraRig, make sure you follow the prerequisites.
There are two main scripts attached to the OVRCameraRig prefab: OVRCameraRig.cs and OVRManager.cs. Together, these scripts provide settings for the camera, display, tracking, quality, and performance of your app.
In Hierarchy View, select the OVRCameraRig prefab, and in the Inspector View, review the following settings under OVRCameraRig.cs and OVRManager.cs scripts.
OVRCameraRig.cs is a component that controls stereo rendering and head tracking. It maintains three child anchor transforms at the poses of the left and right eyes, as well as a virtual center eye that is halfway between them. It is the main interface between Unity and the cameras, and is attached to a prefab that makes it easy to add comfortable VR support to a scene.
Note: All camera control should be done through this component.
Use Fixed Update For Tracking: Select this option to update the tracked poses in the FixedUpdate() method instead of the Update() method to favor physics fidelity. However, if the fixed update rate doesn’t match the rendering frame rate, which is derived from OVRManager.display.appFramerate, the anchors visibly judder.
Disable Eye Anchor Cameras: Select this option to disable the cameras on the eye anchors. In this case, the main camera of the game is used to provide the VR rendering, and the tracking space anchors are updated to provide reference poses.
OVRManager.cs is the main interface to the VR hardware and is added to the OVRCameraRig prefab. It is a singleton that exposes the Oculus SDK to Unity, and includes helper functions that use the stored Oculus variables to help configure the camera behavior. It can be a part of any app object and should only be declared once.
Target Devices:
All apps that target Oculus Quest are automatically compatible to run on Oculus Quest 2. However, for the best compatibility, when you query the headset type that the app is running on, Oculus returns Oculus Quest even if the headset is an Oculus Quest 2. If you want to identify the headset type precisely, select both Oculus Quest and Oculus Quest 2 as target devices. In this case, when you query the headset type, Oculus returns the exact headset that the app is running on. Based on the target headsets, Oculus automatically adds the <meta-data android:name="com.oculus.supportedDevices" android:value="quest" /> element for Oculus Quest, or the <meta-data android:name="com.oculus.supportedDevices" android:value="quest|quest2" /> element for both Oculus Quest and Oculus Quest 2, to the Android manifest file. There is no need to update the Android manifest file manually.
To select the target device, select the relevant checkbox.
When you create apps that target both Oculus Quest and Oculus Quest 2, you can check the headset type to optimize the app and improve the user experience. Call the OVRManager.systemHeadsetType() method to return the headset type that the app is running on. For example, the method returns Oculus_Quest or Oculus_Quest_2 depending on the headset type. The OVRManager.SystemHeadsetType enum lists the headset types that the OVRManager.systemHeadsetType() method returns.
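A minimal check might look like the sketch below. The member names follow this page (OVRManager.systemHeadsetType() and the OVRManager.SystemHeadsetType enum); depending on your SDK version, systemHeadsetType may be exposed as a property rather than a method, and the quality adjustments shown are placeholders for your own per-device tuning.

using UnityEngine;

// Illustrative sketch: branch on the headset the app is running on.
public class HeadsetCheck : MonoBehaviour
{
    void Start()
    {
        // In some SDK versions this is a property rather than a method.
        var headset = OVRManager.systemHeadsetType();

        switch (headset)
        {
            case OVRManager.SystemHeadsetType.Oculus_Quest:
                // The original Quest has less headroom; use cheaper settings.
                QualitySettings.antiAliasing = 2;
                break;
            case OVRManager.SystemHeadsetType.Oculus_Quest_2:
                // Quest 2 can afford higher quality settings.
                QualitySettings.antiAliasing = 4;
                break;
        }
    }
}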
Performance and Quality:
Use Recommended MSAA Level: True by default. Select this option to let OVRManager automatically choose the appropriate MSAA level based on the Oculus device. For example, for Oculus Quest, the MSAA level is set to 4x. This is currently supported only for Unity’s built-in render pipeline.
Note: For Universal Render Pipeline (URP), you need to set the MSAA level to 4x manually. We are aware of the issue that URP does not set the MSAA level automatically, and we will announce the fix on the Release Notes page.
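Until that fix ships, you can set the level yourself. The sketch below assumes URP is the active render pipeline and uses the msaaSampleCount property on the active URP asset; adjust it if your project drives quality settings differently.

using UnityEngine;
using UnityEngine.Rendering.Universal;

// Illustrative sketch: force 4x MSAA under URP, since OVRManager's
// recommended MSAA level currently applies only to the built-in pipeline.
public class ForceUrpMsaa : MonoBehaviour
{
    void Start()
    {
        UniversalRenderPipelineAsset urpAsset = UniversalRenderPipeline.asset;
        if (urpAsset != null)
        {
            urpAsset.msaaSampleCount = 4; // recommended level for Oculus Quest
        }
    }
}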
Monoscopic: If true, both eyes see the same image, rendered from the center eye pose, saving performance on low-end devices. We do not recommend using this setting as it doesn’t provide the correct experience in VR.
Profiler TCP Port: The TCP listening port of the Oculus Profiler Service, which is activated in debug or development builds. When the app is running in the editor or on a device, go to Tools > Oculus > Oculus Profiler Panel to view real-time system metrics.
Tracking:
Tracking Origin Type: Sets the tracking origin type.
Eye Level tracks the position and orientation relative to the device’s position.
Floor Level tracks the position and orientation relative to the floor, based on the user’s standing height as specified in the Oculus Configuration Utility.
For Oculus Rift, OVRManager.display.RecenterPose() recenters the head pose and the tracked controller pose, if present (see OVRInput for more information on tracking controllers).
If Tracking Origin Type is set to Floor Level, OVRManager.display.RecenterPose() resets the x-, y-, and z-axis positions to the origin. If it is set to Eye Level, the x-, y-, and z-axis positions are all reset to the origin, with the y-value corresponding to the height calibration performed with the Oculus Configuration Utility. In both cases, the y rotation is reset to 0, but the x and z rotations are unchanged to maintain a consistent ground plane.
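For example, you might recenter on a controller button press. The sketch below uses OVRManager.display.RecenterPose() as described above; the trackingOriginType assignment and the choice of OVRInput.Button.Two are illustrative assumptions and may need to be adapted to your SDK version and input mapping.

using UnityEngine;

// Illustrative sketch: set the tracking origin at startup and recenter the
// head (and tracked controller) pose when the user presses a button.
public class RecenterOnButton : MonoBehaviour
{
    void Start()
    {
        // Assumption: OVRManager exposes trackingOriginType on its instance.
        OVRManager.instance.trackingOriginType = OVRManager.TrackingOrigin.FloorLevel;
    }

    void Update()
    {
        // Recenter when the user presses the B/Y button on a Touch controller.
        if (OVRInput.GetDown(OVRInput.Button.Two))
        {
            OVRManager.display.RecenterPose();
        }
    }
}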
Late Controller Update: Select this option to update the pose of the controllers immediately before rendering for lower latency between real-world and virtual controller movement. If controller poses are used for simulation/physics, the position may be slightly behind the position used for rendering (~10ms). Any calculations done at simulation time may not exactly match the controller’s rendered position.
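If that difference matters for your physics code, you can sample the controller pose explicitly at simulation time instead of reading the rendered hand anchor. The sketch below is illustrative: it assumes a reference to the scene’s OVRCameraRig and converts the locally tracked pose into world space through the rig’s tracking space.

using UnityEngine;

// Illustrative sketch: sample the right controller pose in FixedUpdate for
// physics, independent of the late-updated pose used for rendering.
public class PhysicsControllerPose : MonoBehaviour
{
    public OVRCameraRig rig;     // assign the scene's OVRCameraRig
    public Rigidbody heldObject; // a rigidbody driven by the controller

    void FixedUpdate()
    {
        // Pose in tracking-space coordinates at simulation time.
        Vector3 localPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Quaternion localRot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

        // Convert to world space via the rig's tracking space transform.
        Vector3 worldPos = rig.trackingSpace.TransformPoint(localPos);
        Quaternion worldRot = rig.trackingSpace.rotation * localRot;

        heldObject.MovePosition(worldPos);
        heldObject.MoveRotation(worldRot);
    }
}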
Display:
You can remaster your app by setting a specific color space at runtime for your Oculus device, which overcomes the color variation that can occur when different color spaces are in use.
Select Enable Specific Color Gamut to set the specific color space. For more information about the available color gamut primaries, go to the Set Specific Color Space topic.
Quest Features:
There are certain settings that are applicable to Oculus Quest only.
Hand Tracking Support: From the list, select the type of input affordance for your app. For example, Controllers only, Controllers and Hands, or Hands only. For more information about setting up hand tracking, go to the Set Up Hand Tracking topic.
Android Build Settings:
The shader stripping feature lets you exclude unused shaders from compilation, which significantly reduces player build time. Select Skip Unneeded Shaders to enable shader stripping. For more information about the different tiers and stripping shaders, go to the Strip Unused Shaders topic.
Security:
allowBackup="false"
flag in the AndroidManifest.xml file.Mixed Reality Capture:
Mixed Reality Capture (MRC) places real-world objects in VR. In other words, it combines images from the real world with the virtual one. To enable the mixed reality support, select Show Properties, and then select enableMixedReality. For more information about setting up mixed reality capture, go to the Unity Mixed Reality Capture guide.