All Oculus Quest developers MUST PASS the concept review prior to gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.
This guide describes how to add and configure mixed reality capture support for your Unreal application. Mixed reality capture is supported for Rift and Quest applications.
For more information on mixed reality capture, how to design your app to enhance the spectator experience, and how to produce high-quality MRC video assets, see the Mixed Reality Capture Best Practice Guide.
Mixed reality capture places real-world people and objects in VR. It allows live video footage of an Oculus user to be composited with the output from a game to create a combined video that shows the player in a virtual scene. Mixed reality capture can be a helpful tool for marketing your game. Following is a screen capture from a mixed reality capture video.
(Courtesy of Medium and artist Dominic Qwek - https://www.oculus.com/medium/)
When you create a mixed-reality capture, live video footage is captured with a stationary or tracked camera.
See the support pages below for detailed steps on how to get started with mixed reality capture, as well as the following video walkthrough:
Mixed Reality Capture can be added to an application by including the OculusVR plugin, and finer-grained control of the feature is available through the Blueprint interface. Even when you use the default Mixed Reality Capture implementation, you are encouraged to test your application in mixed reality mode to ensure compatibility and, if needed, make adjustments using the Blueprint interface. See the Blueprint Reference section that follows for more information.
Users can launch applications with the feature enabled and control several relevant settings through
Engine.ini or command-line parameters. For more information, see the Launch Commands and Configuration sections that follow.
Mixed Reality Capture is available in Unreal versions that use Oculus OVRPlugin 1.31 or later, which means you should use Unreal version 4.22 or later. For more information, see the Version Compatibility Doc.
Camera calibration: You must run the Camera Calibration Tool prior to launching your mixed reality capture application to configure the external camera and VR Object.
Compositing the scene: Mixed reality capture supports two methods for combining application output and video footage:
Both of these modes use the spectator screen to display the output.
You can use the default external composition mode on Quest and on Rift. External composition produces the more polished result. In this mode, the application outputs two windows. The first window displays the entire rendered application. The second window displays the foreground content from the video stream on the left against a green background, and the background content on the right.
In external composition mode, third-party composition software such as OBS Studio or XSplit is required to clip the green screen, combine the images, and compensate for camera latency. See the video below for how to set up OBS with your Oculus Quest.
The following image shows an example of the second window when using external composition.
On Rift you can use direct composition mode, which means your mixed reality capture application streams the real-world footage from your camera to your scene directly, and displays the composited image in the application itself. Direct composition mode requires the use of a green screen for video capture, and the composited image may exhibit some latency from the video stream. We currently recommend using it for proof-of-concept, troubleshooting, and hobby use. The following work for direct composition:
For Rift, you can change any local Mixed Reality Capture settings by editing the
[Oculus.Settings.MixedReality] section of the
Saved/Config/Windows/Engine.ini file. These settings are read and loaded automatically when mixed reality capture starts. The composition method is also loaded from the
Engine.ini file and defaults to external composition if not defined. If a setting is not defined in the
Engine.ini file, its default value is written to the file on start.
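As a sketch, an Engine.ini fragment for this section might look like the following. The key name shown is an assumption for illustration only; the actual keys match the Oculus MR Settings properties listed in the Blueprint Reference section.

```ini
; Saved/Config/Windows/Engine.ini
; Key names below are illustrative -- consult the Blueprint Reference
; section for the exact property names written by the plugin.
[Oculus.Settings.MixedReality]
CompositionMethod=ExternalComposition
```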
The settings available in the config match the Properties in Oculus MR Settings (except bIsCasting). See the Blueprint Reference section that follows for the full list.
By default, mixed reality capture can be enabled with the
-mixedreality command-line parameter when running the game (camera calibration is required; see the [Prerequisite](#prerequisite) section above). This changes the spectator screen of the VR app to display a third-person view of the scene from the perspective of the calibrated camera registered with the OVR server. Settings are loaded automatically from the
[Oculus.Settings.MixedReality] section of the
Engine.ini file. The composition method is also loaded from the
Engine.ini file and defaults to external composition if not defined, but it can be overridden on a session-by-session basis with the
-directcomposition (Rift only) command-line parameter for convenience. These command-line parameters are not saved to the config file by default.
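The launch options above can be sketched as a small shell snippet. Note that `MyGame.exe` is a placeholder for your packaged application binary, not a real file name from this guide:

```shell
#!/bin/sh
# Sketch: assembling launch arguments for an MRC session.
# "MyGame.exe" is a placeholder; substitute your packaged binary.
GAME="MyGame.exe"
ARGS="-mixedreality"             # enable MRC (camera calibration required)
ARGS="$ARGS -directcomposition"  # Rift only: per-session composition override
echo "$GAME $ARGS"               # the full command you would actually run
```

Because `-directcomposition` is applied per session, omitting it falls back to whatever composition method is stored in Engine.ini.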
Mixed Reality Capture can also be enabled in the VR Preview Play-In-Editor mode by running the editor with the same commands. For a list of MRC console commands, see Console Variables and Command Reference.
The Blueprints functions related to MRC include:
A sample map with mixed reality capture enabled is available in the private Oculus Unreal GitHub repository under the
Samples/Oculus/MixedRealitySample folder. For more information on how to access it, see the Version Compatibility Doc.
This map uses the Blueprint interface to create an in-game menu for modifying the Oculus MR Settings.
The following Blueprint script is taken from the sample scene:
Open the MixedRealitySample.uproject file to open the project in Unreal Engine. Once the project loads, run the sample. When the sample is running, from a controller: