Oculus Quest Development

All Oculus Quest developers must pass the concept review before they can gain publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, see Submitting Your App to the Oculus Quest Store.

Unreal Samples

Oculus provides samples which illustrate basic VR concepts in Unreal such as hand tracking, haptics, and the Boundary Component API for interacting with the Guardian System.

Samples are available from the Oculus Unreal GitHub repository. To access this repository, you must be subscribed to the private EpicGames/UnrealEngine repository (see https://www.unrealengine.com/ue4-on-github for details). An Unreal license is not required.

Once you have access to the repository, you can find all samples in the Samples/Oculus folder.

All samples require a version of Unreal Engine that supports the illustrated features. To explore the samples, we generally recommend using the Unreal Engine versions that we ship from the Oculus GitHub repository, because they include the latest features. For details about the features added in each release, see Unreal Engine.

The following sections describe each sample.

Android Permissions

The AndroidPermissions sample shows how to request the Android permissions an Oculus app requires, such as internet, external storage, and microphone permissions. Press the buttons in the UI while the app is running to trigger permission requests. The following image shows the Blueprint that requests microphone permission:

Microphone permission request
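
In a C++ project, the same request can be made with the engine's AndroidPermission plugin. The following is a minimal sketch, assuming that plugin is enabled; RequestMicPermission is a hypothetical helper, not part of the sample:

    // Sketch: request microphone permission at runtime via the
    // AndroidPermission plugin (hypothetical helper, not from the sample).
    #include "AndroidPermissionFunctionLibrary.h"

    void RequestMicPermission()
    {
    #if PLATFORM_ANDROID
        const FString MicPermission = TEXT("android.permission.RECORD_AUDIO");
        if (!UAndroidPermissionFunctionLibrary::CheckPermission(MicPermission))
        {
            TArray<FString> Permissions;
            Permissions.Add(MicPermission);
            // Shows the system dialog; results arrive on the returned
            // callback proxy's OnPermissionsGrantedDelegate.
            UAndroidPermissionFunctionLibrary::AcquirePermissions(Permissions);
        }
    #endif
    }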

See the README.md in the sample folder for more details.

Avatar Sample

The Avatar sample shows how to use a player avatar and a third-person local avatar.

Boundary/Guardian Sample App

The BoundarySample app demonstrates how to:

  • Use Blueprints to access the Oculus Guardian System
  • Position, scale, and rotate objects with respect to the Play Area
  • Determine when the headset or either controller crosses the Guardian boundary

This sample app works only with 6DOF (six degrees of freedom) headsets such as Oculus Rift and Oculus Quest. For a more detailed description of this sample, see Boundary Sample.
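
For reference, the Blueprint nodes the sample uses map to UOculusFunctionLibrary calls in the OculusHMD plugin. A rough C++ sketch, assuming that plugin's API:

    // Sketch: query Guardian data in C++ (the sample does this via Blueprints).
    #include "OculusFunctionLibrary.h"

    void LogGuardianState()
    {
        // Corner points of the configured play area, in tracking space.
        TArray<FVector> PlayArea =
            UOculusFunctionLibrary::GetGuardianPoints(EBoundaryType::Boundary_PlayArea);
        UE_LOG(LogTemp, Log, TEXT("Play area has %d points"), PlayArea.Num());

        // Check whether the headset is currently crossing the outer boundary.
        FGuardianTestResult HmdTest = UOculusFunctionLibrary::GetNodeGuardianIntersection(
            ETrackedDeviceType::HMD, EBoundaryType::Boundary_Outer);
        if (HmdTest.IsTriggering)
        {
            UE_LOG(LogTemp, Warning, TEXT("Headset is crossing the Guardian boundary"));
        }
    }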

Cloud Saves and Downloadable Content (DLC) (new for v19)

This sample shows how to use the cloud saves and downloadable content (DLC) features. Check out the DLCWidget class for the DLC implementation and the UCloudSaveWidget class for the cloud storage implementation.

Note: To run this sample, you must have Quest Publishing Access.

See the README.md in the sample directory for more details.
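
Under the hood, cloud saves go through the Oculus Platform SDK. The following is a rough sketch of the flow; the function and message names are assumed from the Platform SDK's C API and may differ by SDK version (the sample itself wraps this in UCloudSaveWidget):

    // Sketch: ask the Platform SDK where cloud-synced saves should be written.
    #include "OVR_Platform.h"

    void RequestCloudSaveDirectory()
    {
        // Async request; the result arrives through the message queue below.
        ovr_CloudStorage2_GetUserDirectoryPath();
    }

    void PumpPlatformMessages()
    {
        ovrMessageHandle Message = nullptr;
        while ((Message = ovr_PopMessage()) != nullptr)
        {
            if (ovr_Message_GetType(Message) == ovrMessage_CloudStorage2_GetUserDirectoryPath
                && !ovr_Message_IsError(Message))
            {
                // Files written under this directory are synced by the platform.
                const char* Dir = ovr_Message_GetString(Message);
                UE_LOG(LogTemp, Log, TEXT("Cloud save dir: %s"), ANSI_TO_TCHAR(Dir));
            }
            ovr_FreeMessage(Message);
        }
    }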

Hand Tracking Samples

The Oculus GitHub repo contains two samples that illustrate the Oculus hand tracking feature in Unreal Engine. For these samples to compile and run, you must use the 4.25/v17 or later release of the Oculus Unreal Engine integration, which includes the hand tracking feature.

  • Find the hand tracking sample under Samples/Oculus/HandSample in the GitHub repo. Open the HandTrackingSample or HandTrackingCustomSample maps to see how hand tracking is implemented.

  • Another hand tracking sample demonstrates hand tracking while interacting with a virtual model train. Find this sample under Samples/Oculus/HandsTrainSample in the GitHub repo. For more information about this sample, see Train Hand Tracking Sample.
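
Both samples drive their hand meshes through the OculusInput plugin. A minimal C++ sketch, assuming UOculusInputFunctionLibrary from the 4.25/v17 integration (LogRightHandPointer is a hypothetical helper):

    // Sketch: poll hand tracking state in C++ (hypothetical helper).
    #include "OculusInputFunctionLibrary.h"

    void LogRightHandPointer()
    {
        if (!UOculusInputFunctionLibrary::IsHandTrackingEnabled())
        {
            return; // Hands are not currently tracked (e.g. controllers in use).
        }

        // The pointer pose is the system ray used for far-field UI interaction.
        const FTransform Pointer =
            UOculusInputFunctionLibrary::GetPointerPose(EOculusHandType::HandRight);
        UE_LOG(LogTemp, Log, TEXT("Right pointer at %s"),
               *Pointer.GetLocation().ToString());
    }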

Input Sample

The Touch sample illustrates tracking, thumbstick control, and haptics control using the PlayHapticEffect() and PlayHapticSoundWave() functions. When the sample is running, two spheres track with the Touch controllers. The right controller thumbstick controls the position of the light gray box. Touch capacitance sensors detect whether the right thumb is in a touch or near-touch position, which controls the height of the box. Press and hold the left Touch grip button to play a haptics clip. Press and hold the left Touch X button to create a haptics effect by setting the haptics value directly. The following image shows an example:

You will find the Haptics control Blueprint and the Thumbstick control Blueprint in the Touch sample Level Blueprint. NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height, and so forth.
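
In C++, the same two haptics paths look roughly like this; PlayHapticEffect and SetHapticsByValue are standard APlayerController functions, and the HapticEffect asset is one you supply (a sketch, not the sample's own code):

    // Sketch: the two haptics techniques the sample demonstrates.
    #include "GameFramework/PlayerController.h"
    #include "Haptics/HapticFeedbackEffect_Base.h"

    void TriggerHaptics(APlayerController* PC, UHapticFeedbackEffect_Base* HapticEffect)
    {
        // Grip-button path: play a prebuilt haptics clip on the left controller.
        PC->PlayHapticEffect(HapticEffect, EControllerHand::Left);

        // X-button path: set frequency and amplitude directly (both 0.0-1.0).
        PC->SetHapticsByValue(/*Frequency=*/0.5f, /*Amplitude=*/1.0f, EControllerHand::Left);
    }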

Layer Sample

LayerSample is a Blueprint sample that illustrates the use of VR Compositor Layers to display a UMG UI.

This sample includes two spheres that track with the Touch controllers and two UMG widgets rendered as VR Compositor layers. One is rendered as a quad layer and the other as a cylinder layer.

Actor_Blueprint illustrates rendering a UMG widget into a stereo layer. The widget is first rendered into a Material, then the SlateUI texture is pulled from the Material into the stereo layer. This is the UMG widget that is rendered to the quad and cylinder layers in the sample.

Open MenuBlueprint to open the UMG widget in the UMG Editor.

NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height.
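
The key step is feeding the widget's texture to a stereo layer component. A sketch of that step, assuming the UE 4.25-era UStereoLayerComponent API (MyStereoLayer and WidgetTexture are hypothetical names):

    // Sketch: display a texture through the VR compositor as a quad layer.
    #include "Components/StereoLayerComponent.h"

    void SetupQuadLayer(UStereoLayerComponent* MyStereoLayer, UTexture* WidgetTexture)
    {
        MyStereoLayer->SetTexture(WidgetTexture); // e.g. the UMG widget's render target
        MyStereoLayer->SetQuadSize(FVector2D(100.f, 100.f));
        MyStereoLayer->bLiveTexture = true;       // re-upload the texture every frame
    }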

Locomotion and Interaction Sample (new for v19)

The locomotion and interactions sample showcases a handful of common modes for moving the player around. Also included are some interactable objects, focusing on two-handed manipulation, which can be tricky to get right.

This sample demonstrates six different types of locomotion. The following lists the types of locomotion and how to use them when the sample is running.

  • Point and Teleport: Use the thumbstick to point to where you want to go. An arc and indicator on the ground shows the destination and orientation after teleporting. Rotate the stick to change your target orientation. When the stick returns to neutral position, you teleport to the destination.
  • Point and Teleport with Third-Person Avatar: Identical to Point and Teleport, except a third-person avatar walks to the destination to demonstrate it. This is useful to force the player to account for world geometry or travel time, while still allowing the comfort of teleportation.
  • Stepped Translation and Rotation: Move forward and backward at fixed increments with the left thumbstick, and turn left and right in 35° steps with the right thumbstick (see the sketch after this list).
  • Grab and Drag: Hold down the action buttons or triggers and move the controllers over the ground to emulate dragging yourself in a direction. This can also be adapted to handle vertical climbing.
  • Arm Swinging: Hold down the action buttons or triggers and swing your arms to move forward in the direction the controllers are pointing.
  • Dual-Stick Walking: The left stick controls translation and the right stick controls rotation like in a standard first-person game. This type of movement can be uncomfortable in VR, so you should only try it for short periods of time.
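
As a concrete example of the stepped-rotation technique above, a snap-turn handler might look like the following (a sketch; all names are hypothetical):

    // Sketch: snap the pawn's yaw by a fixed step when the right thumbstick
    // leaves its deadzone (35 degrees, matching the sample).
    #include "GameFramework/Pawn.h"

    void HandleSnapTurn(APawn* Pawn, float RightStickX, bool& bStickWasNeutral)
    {
        const float Deadzone = 0.5f;
        const float SnapDegrees = 35.0f;

        if (bStickWasNeutral && FMath::Abs(RightStickX) > Deadzone)
        {
            // One discrete turn per stick flick; sign follows stick direction.
            Pawn->AddActorWorldRotation(
                FRotator(0.f, FMath::Sign(RightStickX) * SnapDegrees, 0.f));
            bStickWasNeutral = false;
        }
        else if (FMath::Abs(RightStickX) <= Deadzone)
        {
            bStickWasNeutral = true; // re-arm once the stick returns to center
        }
    }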

See the README.md in the sample directory for more details.

Mixed Reality Capture Sample

A trivial sample map with mixed reality capture enabled is available in MixedRealitySample. Select the OculusMR_CastingCameraActor1 instance to see how the actor is configured for the Level.

For more information, see Mixed Reality Capture.

The following image shows this sample when it is running.

Rendering Techniques (new for v19)

The RenderingTechniques sample app contains several maps that demonstrate shadow, portal, and text rendering techniques. A summary of each map follows; see the README.md in the sample directory for more details on each map and how to use it.

Cascaded Shadows Map

This map demonstrates dynamic shadows using the cascaded shadow map feature in Unreal Engine.

When the sample is running, use the triggers on the Touch controllers to drag the sliders in the map. The light’s rotation is controlled by a Blueprint script attached to the light.
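
The C++ equivalent of that light-rotation Blueprint is a one-liner per tick (a sketch; RotateSun and the rotation rate are hypothetical):

    // Sketch: rotate a directional light so the cascaded shadows visibly move.
    #include "Engine/DirectionalLight.h"

    void RotateSun(ADirectionalLight* Sun, float DeltaSeconds)
    {
        const float DegreesPerSecond = 10.f;
        Sun->AddActorWorldRotation(FRotator(DegreesPerSecond * DeltaSeconds, 0.f, 0.f));
    }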

Color Grade Map

The ColorGradeMap demonstrates how to use look-up tables (LUTs) to adjust the look of your scene.

The color grading map requires that you have the Unreal source code and that you apply the patch in the ColorGradingLUTPatch directory under the sample folder. When you have applied the patch, you can apply color grading LUTs. To learn more about LUTs for color grading, see Using Lookup Tables (LUTs) for Color Grading in the Unreal documentation.
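
Once the patch is applied, a LUT is assigned through the standard post-process settings. A sketch in C++ (Volume and LUTTexture are hypothetical; the fields are standard FPostProcessSettings members):

    // Sketch: apply a color-grading LUT via a post-process volume.
    #include "Engine/PostProcessVolume.h"

    void ApplyLUT(APostProcessVolume* Volume, UTexture* LUTTexture)
    {
        Volume->Settings.bOverride_ColorGradingLUT = true;
        Volume->Settings.ColorGradingLUT = LUTTexture;
        Volume->bUnbound = true; // affect the whole level, not just the volume bounds
    }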

Distance Field Baked Shadows Map

Distance Field shadows are an Unreal Engine feature that provides precomputed shadows for more distant stationary objects.

For precomputed distance field shadows to work, the light's mobility must be set to Stationary, and you must enable Support Pre-baked Distance Field Shadows in the Mobile Shader Permutation Reduction section under Engine > Rendering in the Project Settings. The following image shows this setting:

Distance Field Shadows
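
For reference, that checkbox can also be set directly in DefaultEngine.ini; the setting name below is assumed from the UE4 renderer settings, so verify it against your engine version:

    [/Script/Engine.RendererSettings]
    bMobileAllowDistanceFieldShadows=True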

Portals Map

The PortalsMap demonstrates two different ways of rendering portals in VR.

The first uses a static parallax-corrected cubemap captured at the location of the portal being looked through. This can be prebaked and is relatively inexpensive performance-wise, but can have issues such as warping when viewed away from the capture point.

The second method uses stereo render targets to render the scene from the perspective of each eye and what it would see through the portal. The effect is very convincing, but at a higher performance cost. The functionality in this sample is split between the PortalCaptureActorBP blueprint, the MF_ScreenAlignedUV_VR material function, and the PortalSampleMaterial material. There are also some helper C++ methods contained in the PortalCaptureActor class.
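
A sketch of the render-target technique's core step, using the standard scene capture API; PortalCam, PortalRT, and the mirrored eye transform are hypothetical, and the sample performs a capture like this per eye:

    // Sketch: render the scene as seen through the portal into a render target.
    #include "Components/SceneCaptureComponent2D.h"
    #include "Engine/TextureRenderTarget2D.h"

    void CapturePortalView(USceneCaptureComponent2D* PortalCam,
                           UTextureRenderTarget2D* PortalRT,
                           const FTransform& EyeThroughPortal)
    {
        PortalCam->TextureTarget = PortalRT;
        PortalCam->SetWorldTransform(EyeThroughPortal); // eye pose mirrored through the portal
        PortalCam->CaptureScene();                      // the portal material samples PortalRT
    }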

Text Rendering Map

The TextRenderingMap demonstrates the effects of different texture filtering settings in combination with high-contrast textures like text. It also shows how VR stereo layers can be used to render high-quality text and textures.