Oculus Rift: Mixed Reality Capture

This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift applications only.

Introduction

Mixed reality capture places real-world people and objects in VR. It allows live video footage of a Rift user to be composited with the output from a game to create combined video that shows the player in a virtual scene.

(Courtesy of Medium and artist Dominic Qwek - https://www.oculus.com/medium/)

Live video footage may be captured with a stationary or tracked camera. For more information and complete setup instructions, see the Mixed Reality Capture Setup Guide.

Mixed Reality Capture can be added to an application simply by including the OculusVR plugin, and finer-grained control of the feature is available through the Blueprint interface. Even when using the default Mixed Reality Capture implementation, developers are encouraged to test their applications in mixed reality mode to ensure compatibility, and to make any necessary adjustments through the Blueprint interface. See “MRC Blueprints” below for more information.

Users can launch applications with the feature enabled and control several relevant settings through Engine.ini or command-line parameters. See the “Launch Commands” and “Configuration” sections below for more information.

Mixed Reality Capture is available in Unreal versions that use Oculus OVRPlugin 1.31 or later through the Oculus GitHub repository. An older Mixed Reality Capture interface is available from Oculus OVRPlugin 1.16 to 1.30. For more information, see Unreal/Oculus SDK Version Compatibility.

Compositing the Scene

Mixed reality capture supports two methods for combining application output and video footage: direct composition and external composition. Both of these modes use the spectator screen to display the output.

In direct composition mode, your mixed reality capture application streams the real-world footage from your camera directly into your scene, and displays the composited image in the application itself. Direct composition mode requires the use of a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend this mode for proof-of-concept, troubleshooting, and hobby use.

For more polished composition, we recommend using external composition mode. In this mode, the application outputs two windows. The first window displays the entire rendered application. The second window displays the foreground content from the video stream on the left, against a green background, and the background content on the right.

External Composition Mode

In external composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency.

For more information on how to composite a scene, see the Mixed Reality Capture Setup Guide.

Preparation

You must run the CameraTool prior to launching your mixed reality capture application to configure the external camera and VR Object. See the Mixed Reality Capture Setup Guide for setup information.

Configuration

You may change any local Mixed Reality Capture settings through the [Oculus.Settings.MixedReality] section of your Saved/Config/Windows/Engine.ini file. These settings are read and loaded automatically when mixed reality capture starts. If a setting is not defined in the Engine.ini file, its default value is written to the file on start. Users may then edit the Engine.ini file to change these settings.

The settings available in the config match the properties in Oculus MR Settings (except bIsCasting). See the “MRC Blueprints” section below for the full list.
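
For example, a minimal [Oculus.Settings.MixedReality] section might look like the following. This is an illustrative sketch only: the key names and value formats shown here are assumed to mirror the Oculus MR Settings properties, and the authoritative versions are the ones the plugin writes to Engine.ini on first launch.

; Saved/Config/Windows/Engine.ini (illustrative values)
[Oculus.Settings.MixedReality]
CompositionMethod=ExternalComposition
ChromaKeyColor=(R=0,G=255,B=0,A=255)
ChromaKeySimilarity=0.600000
CastingLatency=0.010000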

Launch Commands

MR capture can be enabled with the -mixedreality command-line parameter when running the game (camera calibration is required; see the “Preparation” section above). This changes the spectator screen of the VR app to display a third-person view of the scene from the perspective of the calibrated camera present in the OVR server. Settings are automatically loaded from the [Oculus.Settings.MixedReality] section of the Saved/Config/Windows/Engine.ini file.

The composition method is loaded from the Engine.ini file (defaulting to external composition if nothing is defined), but it can be overridden on a session-by-session basis with the -externalcomposition or -directcomposition command-line parameters for convenience. These parameters are not saved to the config by default.

Examples:

// Launch with settings from Engine.ini
MixedRealitySample.exe -vr -mixedreality
// Launch in external composition mode
MixedRealitySample.exe -vr -mixedreality -externalcomposition
// Launch in direct composition mode
MixedRealitySample.exe -vr -mixedreality -directcomposition

Mixed Reality Capture can also be enabled in the “VR Preview” Play-In-Editor mode by running the editor with the same commands.
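
For example, you might open the project with mixed reality capture enabled and then start a VR Preview session (a sketch; the project path is illustrative):

// Launch the editor with MRC enabled, then use VR Preview to play
UE4Editor.exe "C:\Projects\MixedRealitySample\MixedRealitySample.uproject" -mixedreality -externalcomposition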

Mixed Reality Capture Features

The following features work for direct composition:

  • Chroma Key: Chroma key settings allow for fine-tuned control of how the video and application streams are composited. Use these settings to set the reference color of the green screen and to control the thresholds at which video pixels are included in or excluded from the final frame (see the sketch following this list).
  • Dynamic Lighting: When Dynamic Lighting is enabled, video captured by the physical camera is illuminated in the composited scene by light effects and flashes within the application. For example, a player would briefly be brightly lit during an in-game explosion. Lighting is applied to the video on a flat plane parallel to the camera unless a depth-sensing camera (such as the ZED camera) is used, in which case pixel depth is used to generate a per-pixel normal for the lighting process.
  • Virtual Green Screen: When enabled, Virtual Green Screen crops video footage that falls outside of the Guardian System Outer Boundary or Play Area configured by the user. The Outer Boundary is the actual perimeter drawn by the user during Touch setup, while the Play Area is a rectangle calculated from the Outer Boundary. Note that the Outer Boundary and Play Area are two-dimensional shapes on the x and z axes; the virtual green screen is a 3D volume whose caps are set at +/- 10 meters by default.
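
As a rough illustration of adjusting these features at runtime, the following C++ sketch sets chroma key values through the plugin's settings object. The header names, GetOculusMRSettings(), and the property names are assumptions based on the OculusVR plugin's Oculus MR Settings and may differ between plugin versions; the same properties are also accessible from Blueprints.

// Sketch only: accessor and property names below are assumed from the
// OculusVR plugin's Oculus MR Settings and should be checked against
// your plugin version.
#include "OculusMRFunctionLibrary.h"
#include "OculusMR_Settings.h"

void ConfigureChromaKey()
{
    // Assumed accessor for the global Oculus MR Settings object
    if (UOculusMR_Settings* Settings = UOculusMRFunctionLibrary::GetOculusMRSettings())
    {
        Settings->ChromaKeyColor = FColor(0, 255, 0);   // reference color of the green screen
        Settings->ChromaKeySimilarity = 0.6f;           // how closely a pixel must match to be keyed out
        Settings->ChromaKeySmoothRange = 0.03f;         // soft edge around the similarity threshold
        Settings->ChromaKeySpillRange = 0.04f;          // suppresses green spill near edges
    }
}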

MRC Blueprints

The Blueprint functions related to MRC include:

Sample Scene

A sample map with mixed reality capture enabled is available in the private Oculus Unreal GitHub repository (access instructions here) in Samples/Oculus/MixedRealitySample. This map uses the Blueprint interface to create an in-game menu for modifying the Oculus MR Settings (press the B button to open the menu and the A button to select options).

The in-game menu is built from a Blueprint script included in the sample scene.

MRC Console Commands

For information regarding MRC console commands, see Console Variables and Commands Reference.