This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift applications only.
Please note that Mixed Reality Capture is only supported on the original Oculus Rift and Touch configuration with external sensors.
Mixed reality capture places real-world people and objects in VR. It allows live video footage of a Rift user to be composited with the output from a game to create combined video that shows the player in a virtual scene.
(Courtesy of Medium and artist Dominic Qwek - https://www.oculus.com/medium/)
Live video footage may be captured with a stationary or tracked camera. For more information and complete setup instructions, see the Mixed Reality Capture Setup Guide.
Mixed Reality Capture can be added to an application simply by including the OculusVR plugin, and finer-grained control of the feature is available through the Blueprint interface. Even when using the default Mixed Reality Capture implementation, developers are encouraged to test their applications in mixed reality mode to ensure compatibility and, if needed, make adjustments through the Blueprint interface. See “Blueprint Reference” below for more information.
Users can launch applications with the feature enabled and control several relevant settings through Engine.ini or command-line parameters. See the “Launch Commands” and “Config” sections below for more information.
Mixed Reality Capture is available in Unreal versions that use Oculus OVRPlugin 1.31 or later through the Oculus GitHub repository. An older Mixed Reality Capture interface is available from Oculus OVRPlugin 1.16 to 1.30. For more information, see Unreal/Oculus SDK Version Compatibility.
Mixed reality capture supports two methods for combining application output and video footage: direct composition and external composition. Both modes use the spectator screen to display the output.
In direct composition mode, your mixed reality capture application streams the real-world footage from your camera to your scene directly, and displays the composited image in the application itself. Direct composition mode requires the use of a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend using it for proof-of-concept, troubleshooting, and hobby use.
For more polished composition, we recommend using external composition mode. In this mode, the application outputs two windows. The first window displays the entire rendered application. The second window displays the foreground content from the video stream on the left against a green background, and it displays the background content on the right.
In external composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency.
For more information on how to composite a scene, see the Mixed Reality Capture Setup Guide.
You must run the CameraTool prior to launching your mixed reality capture application to configure the external camera and VR Object. See the Mixed Reality Capture Setup Guide for setup information.
You may change any local Mixed Reality Capture settings by specifying launch settings in the [Oculus.Settings.MixedReality] section in your Saved/Config/Windows/Engine.ini file. These settings are automatically read and loaded when mixed reality capture starts. If a setting is not defined in the Engine.ini file, the default value will be written to the file on start. Users may edit the Engine.ini file to change these settings later.
The settings available in the config match the Properties in Oculus MR Settings (except bIsCasting). See the “Blueprint Reference” section below for the full list.
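As a sketch, such an Engine.ini section might look like the following. The property names and values here are illustrative assumptions, not an authoritative list; the actual properties are those of Oculus MR Settings, documented in the “Blueprint Reference” section below.

```ini
[Oculus.Settings.MixedReality]
; Composition method used when mixed reality capture starts
; (assumed values: ExternalComposition or DirectComposition)
CompositionMethod=ExternalComposition
; Chroma key color for direct composition (green screen)
ChromaKeyColor=(R=0,G=255,B=0,A=255)
; Latency compensation for the camera stream, in seconds (example value)
CastingLatency=0.02
```

Because defaults are written to the file on first start, a practical workflow is to launch the application once with mixed reality capture enabled, then edit the generated section.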
Mixed Reality Capture can be enabled with the -mixedreality command-line parameter when running the game (camera calibration required; see the “Preparation” section above). This parameter changes the spectator screen of the VR app to display a third-person view of the scene from the perspective of the calibrated camera present in the OVR server. Settings are automatically loaded from the [Oculus.Settings.MixedReality] section of the Saved/Config/Windows/Engine.ini file.
The composition method is loaded from the Engine.ini file (defaulting to external composition if nothing is defined), but it can be overridden on a session-by-session basis with the -directcomposition or -externalcomposition command-line parameters for convenience. These command-line parameters are not saved to the config file by default.
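For example, assuming a packaged game executable named MyGame.exe (a hypothetical name for illustration), the launch commands above could be combined like so:

```
REM Enable mixed reality capture, using the composition method from Engine.ini
MyGame.exe -mixedreality

REM Enable mixed reality capture and force direct composition for this session only
MyGame.exe -mixedreality -directcomposition
```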
Mixed Reality Capture can also be enabled in the “VR Preview” Play-In-Editor mode by running the editor with the same commands.
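For instance, the editor could be started with the same parameter (the project path here is a hypothetical example):

```
UE4Editor.exe "C:\Projects\MyProject\MyProject.uproject" -mixedreality
```

After the editor opens, start a “VR Preview” Play-In-Editor session to see the mixed reality output.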
The following features work for direct composition:
The Blueprint functions related to MRC include:
A sample map with mixed reality capture enabled is available in the private Oculus Unreal GitHub repository (access instructions here) in Samples/Oculus/MixedRealitySample. This map uses the blueprint interface to create an in-game menu for modifying the Oculus MR Settings (Press the ‘B’ button to open the menu and ‘A’ button to select options).
The following Blueprint script is taken from the sample scene:
For information regarding MRC console commands, see Console Variables and Commands Reference.