Oculus Quest Development

All Oculus Quest developers MUST PASS the concept review prior to gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.

Mixed Reality Capture

This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift and Quest applications.

Introduction

Mixed reality capture places real-world people and objects in VR. It allows live video footage of an Oculus user to be composited with the output from a game to create a combined video that shows the player in the virtual scene. Mixed reality capture can be a helpful tool for marketing your game. The following is a screen capture from a mixed reality capture video.

(Courtesy of Medium and artist Dominic Qwek - https://www.oculus.com/medium/)

When you create a mixed-reality capture, live video footage is captured with a stationary or tracked camera.

Mixed Reality Documentation

Detailed documentation for mixed reality capture is available for each target device.

Mixed Reality Capture with Unreal

Mixed Reality Capture can be added to an application by including the OculusVR plugin, and finer-grained control of the feature is available through the Blueprint interface. Even when you use the default Mixed Reality Capture implementation, you are encouraged to test your application in mixed reality mode to ensure compatibility and to make any necessary adjustments through the Blueprint interface. See the Blueprint Reference section that follows for more information.
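If the plugin is not already enabled for your project, you can enable it in your project's .uproject file. This is a minimal sketch; the plugin name below matches how it ships with recent engine versions, but verify it against your engine installation:

    {
      "FileVersion": 3,
      "EngineAssociation": "4.22",
      "Plugins": [
        {
          "Name": "OculusVR",
          "Enabled": true
        }
      ]
    }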

Users can launch applications with the feature enabled and control several relevant settings through Engine.ini settings or command-line parameters. For more information, see the Launch Commands and Configuration sections that follow.

Mixed Reality Capture is available in Unreal versions that use Oculus OVRPlugin 1.31 or later, which means you should use Unreal version 4.22 or later. For more information, see the Version Compatibility Doc.

Camera Calibration

You must run the Camera Calibration Tool prior to launching your mixed reality capture application to configure the external camera and VR Object.

Compositing the Scene

Mixed reality capture supports two methods for combining application output and video footage:

  • External composition, which is supported on Oculus Rift and Quest
  • Direct composition, which you can use only on a Rift device

Both of these modes use the spectator screen to display the output. For more information on how to composite a scene, see the sections that follow.

External Composition Mode

You can use the default external composition mode on both Quest and Rift. Of the two modes, external composition produces the more polished output. In this mode, the application outputs two windows. The first window displays the entire rendered application. The second window displays the foreground content from the video stream on the left, against a green background, and the background content on the right.

In external-composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency.

The following image shows an example of the second window when using external composition.

Direct Composition Features

On Rift, you can use direct composition mode, in which your mixed reality capture application streams the real-world footage from your camera directly into your scene and displays the composited image in the application itself. Direct composition mode requires the use of a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend it for proof-of-concept, troubleshooting, and hobby use. The following features are available in direct composition:

  • Chroma Key: Chroma key settings allow for fine-tuned control of how the video and application streams are composited. Use these settings to set the reference color of the green screen and control various thresholds at which video pixels are included or excluded from the final frame.
  • Dynamic Lighting: When Dynamic Lighting is enabled, video captured by the physical camera is illuminated in the composited scene by light effects and flashes within the application. For example, a player would briefly be brightly lit during an explosion in the game. Lighting is applied to video on a flat plane parallel to the camera.
  • Virtual Green Screen: When enabled, Virtual Green Screen crops video footage that falls outside of the Guardian System Outer Boundary or Play Area configured by the user. The Outer Boundary is the actual perimeter drawn by the user during Touch setup, while the Play Area is a rectangle calculated from the Outer Boundary. Note that the Outer Boundary and Play Area are two-dimensional shapes in the x and z axes, while the virtual green screen is a 3D volume whose caps are set at +/- 10 meters by default.

Configuration in Unreal Engine

For Rift, you can change any local Mixed Reality Capture settings by specifying launch settings in the [Oculus.Settings.MixedReality] section in the Saved/Config/Windows/Engine.ini file.
These settings are automatically read and loaded when mixed reality capture starts. The composition method is also loaded from the Engine.ini file and defaults to external composition if not defined. If a setting is not defined in the Engine.ini file, its default value is written to the file on start.

The settings available in the config match the Properties in Oculus MR Settings (except bIsCasting). See the Blueprint Reference section that follows for the full list.
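For example, a populated section might look like the following. This is an illustrative sketch: the key names mirror the Oculus MR Settings properties (see the Blueprint Reference section), and the exact names and defaults may differ across plugin versions:

    [Oculus.Settings.MixedReality]
    ; Illustrative values; key names mirror the Oculus MR Settings properties
    CompositionMethod=ExternalComposition
    ChromaKeyColor=(R=0,G=255,B=0,A=255)
    ChromaKeySimilarity=0.600000
    ChromaKeySmoothRange=0.030000
    ChromaKeySpillRange=0.040000
    CastingLatency=0.100000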

Launch Commands and Specifying Composition Mode

By default, mixed reality capture can be enabled with the -mixedreality command-line parameter when running the game (camera calibration is required; see the Camera Calibration section above). This changes the spectator screen of the VR app to display a third-person view of the scene from the perspective of the calibrated camera stored in the OVR server. Settings are automatically loaded from the [Oculus.Settings.MixedReality] section in the Saved/Config/Windows/Engine.ini file.

The composition method is loaded from the Engine.ini file and defaults to external composition if not defined, but it can be overridden on a session-by-session basis with the -externalcomposition or -directcomposition (Rift only) command-line parameters for convenience. These command-line parameters are not saved to the config by default.
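For example, a packaged Rift build might be launched as follows (the executable name is a placeholder for your own game):

    MyGame.exe -mixedreality -externalcomposition

To try direct composition instead, substitute -directcomposition (Rift only).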

Mixed Reality Capture can also be enabled in the VR Preview Play-In-Editor mode by launching the editor with the same command-line parameters. For more information on the console commands, see Console Variables and Command Reference.
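For example, you might start the editor like this before using VR Preview (the editor executable and project path are placeholders for your own setup):

    UE4Editor.exe "C:\Projects\MyProject\MyProject.uproject" -mixedreality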

MRC Console Commands

For a list of MRC console commands, see Console Variables and Command Reference.

MRC Blueprints

The Blueprint functions related to MRC include:

Sample Scene

A sample map with mixed reality capture enabled is available in the private Oculus Unreal GitHub repository under the Samples/Oculus/MixedRealitySample folder. For more information on how to access the repository, see the Version Compatibility Doc.

This map uses the Blueprint interface to create an in-game menu for modifying the Oculus MR Settings.
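The same settings can also be driven from C++ rather than Blueprint. The following is a hypothetical sketch, not the sample's script; it assumes the OculusMR module exposes the UOculusMRFunctionLibrary::GetOculusMRSettings accessor and the property names shown, so verify them against your plugin version's Blueprint Reference:

    #include "OculusMRFunctionLibrary.h"
    #include "OculusMR_Settings.h"

    // Hypothetical sketch: switch to external composition and start casting.
    // Class, function, and property names are assumptions to verify against
    // the plugin's Blueprint Reference.
    void EnableMixedRealityCapture()
    {
        if (UOculusMR_Settings* Settings = UOculusMRFunctionLibrary::GetOculusMRSettings())
        {
            // These properties mirror the [Oculus.Settings.MixedReality] keys
            Settings->CompositionMethod = EOculusMR_CompositionMethod::ExternalComposition;
            // Depending on plugin version, a setter (e.g. SetIsCasting) may be
            // required for the change to take effect
            Settings->bIsCasting = true;
        }
    }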

The following Blueprint script is taken from the sample scene:

Click the MixedRealitySample.uproject file to open the project in Unreal Engine. Once the project loads, you can run the sample. When the sample is running, from a controller:

  • Press the ‘B’ button to open the menu
  • Press the ‘A’ button to select options