Unreal Mixed Reality Capture

This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift applications only.

Introduction

Mixed reality capture places real-world people and objects in VR. It allows live video footage of a Rift user to be composited with the output from a game to create combined video that shows the player in a virtual scene.

(Courtesy of Medium and artist Dominic Qwek - https://www.oculus.com/medium/)

Live video footage may be captured with a stationary or tracked camera. For more information and complete setup instructions, see the Mixed Reality Capture Setup Guide.

Once configured to use mixed reality capture, applications can be launched with the feature enabled by running them with the appropriate parameter - see "Command-line Parameter Reference" below for more information.

Compositing the Scene

Mixed reality capture supports two methods for combining application output and video footage: direct composition and external composition.

In direct composition mode, your mixed reality capture application composites the real-world footage from your camera directly into your scene and displays the combined image in the application itself. Direct composition mode requires a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend it for proof-of-concept, troubleshooting, and hobby use.

For more polished composition, we recommend using external composition mode. In this mode, the application outputs two windows. The MirrorWindow displays the application, as shown below. The second window displays the foreground content from the video stream on the left against a green background, and it displays the background content on the right.

External Composition Mode

In external composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency.

For more information on how to composite a scene, see the Mixed Reality Capture Setup Guide.

Preparation

You must run the CameraTool prior to launching your mixed reality capture application to configure the external camera and VR Object. See the Mixed Reality Capture Setup Guide for setup information.

Create Simple VR Application (Optional)

If you wish to experiment with this feature but do not have a VR application prepared, create a basic application with the following steps.

  1. Create a new project with the Virtual Reality Blueprint template.
  2. Open the MotionControllerMap in Content > VirtualRealityBP > Maps.
  3. Configure the Virtual Reality Blueprint for mixed reality capture.
    1. (Optional) To be able to use a Touch controller as an input device with the Virtual Reality Blueprint, open the Project Settings menu, select Engine - Input and expand Action Mappings. Modify the settings as shown below.
    2. (Optional) Open Editor Preferences (Edit > Editor Preferences), select General - Miscellaneous, and uncheck Use Less CPU when in Background. This prevents applications with mixed reality capture enabled from dropping to a low frame rate when you switch focus to the composition software.
    3. The MotionControllerPawn Blueprint contains a bug that incorrectly sets the Tracking Origin to "Eye Level". Add a link between "Default" and "Set Tracking Origin (Floor Level)" as shown (an equivalent C++ call is sketched after this list):
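
If you prefer to apply the same fix in C++, the engine exposes the tracking origin through UHeadMountedDisplayFunctionLibrary. A minimal sketch, assuming a hypothetical pawn class AMyVRPawn:

#include "HeadMountedDisplayFunctionLibrary.h"

void AMyVRPawn::BeginPlay()
{
    Super::BeginPlay();

    // Force a floor-level tracking origin, mirroring the Blueprint fix above.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);
}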

Add Mixed Reality Capture Support

To enable mixed reality capture, you will add a camera actor to your map and, optionally, set your tracking space origin as the camera's tracking reference. See the "Blueprint Reference" below for detailed information on the various components and settings.

  1. In the Level editor:
    1. Add an instance of "Oculus MR Casting Camera Actor" to the map, which will be used in mixed reality capture. The composition parameters for this view may be set in the Oculus MR section of the OculusMR_CastingCameraActor instance.
  2. In the Level Blueprint editor:
    1. Set VRPawn’s VROrigin (or the tracking origin component in your map) as the OculusMR_CastingCameraActor1’s TrackingReferenceComponent. If you plan to use the first PlayerController’s position as the tracking reference, you may skip this step. See "Blueprint Reference" below for more information.
    2. Bind the OculusMR_CastingCameraActor1 to the first TrackedCamera, which was configured with the CameraTool. The final Blueprint should look like this (a rough C++ equivalent is sketched after this list):
  3. (Optional) Check the Casting Auto Start checkbox in the Oculus MR section of OculusMR_CastingCameraActor1 to configure the engine to automatically open the Casting Window on launch. This option is useful for debugging.
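
If you are working in C++ rather than the Level Blueprint, the wiring from steps 2.1 and 2.2 looks roughly like the sketch below. The types and member names come from the "Blueprint Reference" in this guide; the include paths, the exact method signatures, and the AMyGameMode class are assumptions, so check the OculusMR plugin headers for your engine version.

#include "OculusMR_CastingCameraActor.h"   // include path is an assumption
#include "OculusMRFunctionLibrary.h"       // include path is an assumption

void AMyGameMode::SetupMixedRealityCapture(AOculusMR_CastingCameraActor* CastingCamera,
                                           USceneComponent* VROrigin)
{
    // Step 2.1: use the pawn's VROrigin as the tracking reference.
    CastingCamera->TrackingReferenceComponent = VROrigin;

    // Step 2.2: bind to the first camera calibrated with the CameraTool.
    TArray<FTrackedCamera> TrackedCameras;
    UOculusMRFunctionLibrary::GetAllTrackedCamera(TrackedCameras);   // out-parameter assumed
    if (TrackedCameras.Num() > 0)
    {
        CastingCamera->TrackedCamera = TrackedCameras[0];
        CastingCamera->BindToTrackedCameraIndexIfAvailable(0);       // index parameter assumed
    }
}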

To check that everything is working properly, launch the map in VR Preview mode and verify that the Casting Window opens with the mixed reality capture content.

Sample Scene

A trivial sample map with mixed reality capture enabled is available in our GitHub repository (access instructions here) in Samples/Oculus/MixedRealitySample.

Launch Commands

Once you package the sample scene, you may launch it with these parameters.

You may configure your application in Blueprints to launch with mixed reality capture enabled, but for debugging purposes only: the Blueprint setting bCastingAutoStart is automatically disabled when you build your package.

// launch in direct composition mode
MixedRealitySample.exe -vr -mxr_open_direct_composition 	

// launch in direct composition mode with MirrorWindow projection
MixedRealitySample.exe -vr -mxr_open_direct_composition -mxr_project_to_mirror_window 

// launch in MultiView mode
MixedRealitySample.exe -vr -mxr_open_multiview 

// launch in MultiView with MirrorWindow projection
MixedRealitySample.exe -vr -mxr_open_multiview -mxr_project_to_mirror_window

Blueprint Reference

AOculusMR_CastingCameraActor

Properties

Property

Description

bCastingAutoStart

Starts mixed reality capture casting automatically when the level starts. This option is for debugging only, and will be automatically disabled when the game is launched as a standalone package. Use launch commands to launch applications with mixed reality capture enabled.

bProjectToMirrorWindow

Set to true to cast to the MirrorWindow. This can simplify window switching, especially in a single-monitor configuration. By default the scene is cast to a standalone window, which offers the most precision in the composition.

When set to true, the casting window is automatically minimized on startup, since the content is cast to the MirrorWindow instead.

CompositionMethod

MultiView (default): The casting window includes the background and foreground view for external composition.

DirectComposition: The game scene is composited with the camera frame directly.

ClippingReference

Specifies the distance from the camera to the mixed reality capture casting background/foreground boundary. Set to CR_TrackingReference to use the distance to the Tracking Reference (recommended for stationary experiences). Set to CR_Head to use the distance to the HMD (default, recommended for roomscale experiences).

TrackedCamera

Information about the tracked camera which this object is bound to.

TrackingReferenceComponent

(optional) If the application uses a VROrigin component to set the tracking space origin, specify that component here. Otherwise the system will use the location of the first PlayerController as the tracking reference.

bFollowTrackingReference

If true, the casting camera will automatically follow the movement of the tracking reference.

bUseTrackedCameraResolution

If true, the casting viewports will use the same resolution as the camera used in the calibration process.

WidthPerView

When bUseTrackedCameraResolution is false, sets the width of each casting viewport (foreground, background, or direct composited).

HeightPerView

When bUseTrackedCameraResolution is false, sets the height of each casting viewport (foreground, background, or direct composited).

CapturingCamera

When CompositionMethod is set to DirectComposition, indicates which physical camera device provides the video frame for compositing.

CastingLatency

When CompositionMethod is set to MultiView, sets the latency of the casting output. This setting may be used to help sync with the camera latency in the external composition application.

ChromaTorelanceA

[Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight non-green values in a pixel. For example, if the character image looks too transparent, you may increase this value to make it more opaque.

ChromaTorelanceB

[Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight the green value. If mid-range greens don’t appear to be cut out, increasing B or decreasing A may help.

ChromaShadows

[Green-screen removal] When CompositionMethod is set to DirectComposition, the shadow threshold helps mitigate shadow casting issues by eliminating very dark pixels.

ChromaAlphaCutoff

[Green-screen removal] When CompositionMethod is DirectComposition, alpha cutoff is evaluated after chroma-key evaluation and before the bleed test to fully discard pixels with a low alpha value.
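
Most of these properties are normally edited in the Details panel of the OculusMR_CastingCameraActor instance, but because they are exposed on the actor they can also be set from code before casting starts; whether each is writable at run time is an assumption. A sketch with illustrative values, where CastingCamera points at the actor instance as in the earlier sketch:

// Property names are taken from the table above; the values are illustrative.
CastingCamera->bProjectToMirrorWindow = true;       // cast into the MirrorWindow
CastingCamera->bFollowTrackingReference = true;     // follow the player automatically
CastingCamera->bUseTrackedCameraResolution = false; // use explicit per-view sizes instead
CastingCamera->WidthPerView = 960;
CastingCamera->HeightPerView = 540;
CastingCamera->CastingLatency = 0.05f;              // MultiView only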

Methods

Method

Description

BindToTrackedCameraIndexIfAvailable

Binds the casting camera to the calibrated external camera.

If there is no calibrated external camera, the TrackedCamera parameters must be set up to match the CastingCameraActor placement. This provides an easy way to place a stationary casting camera directly in the level.

RequestTrackedCameraCalibration

When bFollowTrackingReference is false, manually call this method to move the casting camera to follow the tracking reference (i.e., the player).

OpenCastingWindow

Opens the casting window.

CloseCastingWindow

Closes the casting window.

ToggleCastingWindow

Toggles the casting window.

HasCastingWindowOpened

Checks if the casting window has already been opened.
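
As a usage illustration, the window methods can be driven from gameplay code, for example a debug key binding. A minimal sketch, assuming parameter-less signatures and a hypothetical AMyDebugPawn that holds a CastingCamera pointer:

void AMyDebugPawn::OnToggleCastingPressed()
{
    if (CastingCamera == nullptr)
    {
        return;
    }

    // Equivalent to a single ToggleCastingWindow() call; written out here
    // only to illustrate HasCastingWindowOpened().
    if (CastingCamera->HasCastingWindowOpened())
    {
        CastingCamera->CloseCastingWindow();
    }
    else
    {
        CastingCamera->OpenCastingWindow();
    }
}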

FTrackedCamera

Properties

Property

Description

Index

>=0: the index of the external camera.

-1: Do not bind to any external camera (i.e., set up to match the manual CastingCameraActor placement).

Name

The external camera name set through the CameraTool.

FieldOfView

Horizontal FOV, in degrees.

SizeX, SizeY

Resolution of the camera frame.

AttachedTrackedDevice

The tracking node the external camera is bound to:

  • None: stationary camera
  • HMD, LTouch, RTouch: HMD or left/right Touch
  • DeviceObjectZero: The VR object

CalibratedRotation, CalibratedOffset

The relative pose of the camera to the attached tracking device.

Equals the absolute pose in the tracking space if AttachedTrackedDevice==None.

UserRotation, UserOffset

(optional) Provides a user pose to fine-tune the relative camera pose at run time.

Use these to match the manual CastingCameraActor placement in the level when Index == -1.
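
For example, a stationary casting camera placed by hand, with no CameraTool calibration, can be described by filling the struct with Index == -1 and a user pose. A sketch with illustrative values, assuming direct member access on FTrackedCamera:

FTrackedCamera ManualCamera;
ManualCamera.Index = -1;                                  // do not bind to a calibrated camera
ManualCamera.FieldOfView = 60.0f;                         // illustrative horizontal FOV, degrees
ManualCamera.SizeX = 1920;                                // illustrative frame resolution
ManualCamera.SizeY = 1080;
ManualCamera.UserRotation = FRotator(0.0f, 180.0f, 0.0f); // illustrative pose matching the
ManualCamera.UserOffset = FVector(300.0f, 0.0f, 150.0f);  // manual CastingCameraActor placement

CastingCamera->TrackedCamera = ManualCamera;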

UOculusMRFunctionLibrary

Methods

Method

Description

GetAllTrackedCamera

Retrieves an array of all tracked cameras that were calibrated through the CameraTool.
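
A quick way to verify the CameraTool calibration from code is to enumerate and log the cameras. A sketch, assuming an out-parameter signature and that Name is an FString:

TArray<FTrackedCamera> Cameras;
UOculusMRFunctionLibrary::GetAllTrackedCamera(Cameras);   // signature assumed

for (const FTrackedCamera& Camera : Cameras)
{
    // Log the index and the name assigned in the CameraTool.
    UE_LOG(LogTemp, Log, TEXT("Tracked camera %d: %s"), Camera.Index, *Camera.Name);
}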

Command-line Parameter Reference

Parameter

Description

-mxr_open

Automatically open the casting window in the preset composition mode.

-mxr_open_multiview

Automatically open the casting window in MultiView composition mode.

-mxr_open_direct_composition

Automatically open the casting window in DirectComposition mode.

-mxr_project_to_mirror_window

Project the casting output to the MirrorWindow.

Console commands

Command

Description

mr.AutoOpenCasting [0|1|2]

Automatically open the casting window: 0=Off; 1=MultiView; 2=DirectComposition

mr.ChromaTorelanceA <float>

[Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight non-green values in a pixel. For example, if the character image looks too transparent, you may increase this value to make it more opaque.

mr.ChromaTorelanceB <float>

[Green-screen removal] When CompositionMethod is set to DirectComposition, sets how heavily to weight the green value. If mid-range greens don’t appear to be cut out, increasing B or decreasing A may help.

mr.ChromaShadows <float>

[Green-screen removal] When CompositionMethod is set to DirectComposition, the shadow threshold helps mitigate shadow casting issues by eliminating very dark pixels.

mr.ChromaAlphaCutoff <float>

[Green-screen removal] When CompositionMethod is DirectComposition, alpha cutoff is evaluated after chroma-key evaluation and before the bleed test to fully discard pixels with a low alpha value.

mr.CastingLantency <float>

The casting latency in MultiView mode.

mr.ProjectToMirrorWindow [0|1]

Project the casting output to the MirrorWindow.
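
For example, when tuning a direct composition green screen at run time, you might enter commands like the following into the in-game console. The numeric values are illustrative starting points only, not recommendations:

// open the casting window in DirectComposition mode
mr.AutoOpenCasting 2

// weight non-green values more heavily if the player looks too transparent
mr.ChromaTorelanceA 20.0

// discard pixels whose alpha falls below the cutoff
mr.ChromaAlphaCutoff 0.1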