These Unreal samples illustrate how to implement basic VR concepts and Meta Quest features in your apps.
Hand Tracking
The Unreal-HandSample GitHub repo contains two samples illustrating the Meta Quest hand-tracking feature in Unreal Engine. Open the HandTrackingSample or HandTrackingCustomSample maps to see how hand tracking is implemented.
The app displays two spheres that track with the Touch controllers and two UMG widgets rendered as VR compositor layers. This example renders one as a quad layer and the other as a cylinder layer.
Actor_Blueprint illustrates rendering a widget into a stereo layer: the widget is first rendered into a material, and the SlateUI texture is then pulled from the material into the stereo layer. As a result, the UMG widget appears on both the quad and cylinder layers.
Open MenuBlueprint to open the UMG widget in the UMG Editor.
NewGameMode and VRCharacter are used to initialize the scene and make the scene display at the appropriate height.
Locomotion and Interactions
This sample demonstrates a range of different VR locomotion and interaction types. It is available in the Unreal-Locomotion GitHub repo.
Locomotion
The following list details the locomotion types and their associated player controls:
Point and Teleport: Move the thumbstick away from the neutral position on either controller and point where you want to go. An arc and indicator on the ground will show the destination and orientation you’ll teleport to. Rotate the stick to change your target orientation. When the stick returns to neutral, you will be teleported to the destination.
Point and Teleport (With Third-Person Avatar): This works like Point and Teleport, except a third-person avatar walks to the destination.
Stepped Translation and Rotation: Push the left stick forward or backward to move in fixed increments. Push the right stick left or right to rotate in 35-degree intervals.
Grab and Drag: Hold down the action buttons or triggers and move the controllers over the ground to drag yourself in a direction.
Arm Swinging: Hold down the action buttons or triggers and swing your arms to move forward in the direction the controllers are pointing.
Dual Stick Walking: The left stick controls translation, and the right stick controls rotation, just like in a standard first-person non-VR game. This type of movement can be uncomfortable in VR, so we recommend you only try it briefly or without the headset fully covering your vision until you are used to it.
These locomotion methods are implemented in the MotionControllerPawn blueprint.
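As a rough sketch of the stepped-rotation idea (illustrative only; the sample implements this in Blueprints, and the helper below is hypothetical), a snap turn adds or subtracts a fixed yaw increment whenever the stick crosses a deadzone threshold:

```cpp
#include <cmath>

// Hypothetical helper, not the sample's code. The 35-degree interval mirrors
// the step described above; the deadzone value is an assumption.
constexpr float kSnapDegrees = 35.0f;
constexpr float kDeadzone = 0.5f;

// Returns the new yaw after reading the right stick's X axis once.
// A real implementation would also latch the input so that holding the
// stick does not rotate every frame.
float ApplySnapTurn(float currentYaw, float rightStickX) {
    if (rightStickX > kDeadzone)  currentYaw += kSnapDegrees;
    if (rightStickX < -kDeadzone) currentYaw -= kSnapDegrees;
    // Wrap into [0, 360).
    currentYaw = std::fmod(currentYaw, 360.0f);
    if (currentYaw < 0.0f) currentYaw += 360.0f;
    return currentYaw;
}
```

In the Blueprint version, the same latch-and-step logic lives in the pawn's input event graph.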
Interactions
This sample app focuses on two-handed object manipulation, which can be tricky. Therefore, this example provides several interactive objects for you to examine.
Blue cubes: These can be picked up and thrown using the grab button on either controller. We have implemented this interaction in the BP_PickupCube blueprint.
Gun: To pick the weapon up, grab it with one hand. Bring the other hand near the barrel and grab to control the aim. The firearm will stretch based on the distance between your hands. We’ve implemented this interaction in the BP_PickupTwoHandedAim blueprint.
Blue pole: To pick up the pole, grab it with one hand. That hand controls its position and orientation. Grabbing the pole with a second hand activates two-handed mode: while the pole stays aligned between your hands, the first hand maintains its position. We’ve implemented this interaction in the BP_PickupTwoHandedCylinder blueprint.
Bow: To pick up the bow, grab it with one hand. Grab it with the other hand to begin drawing an arrow. Release the second hand to fire the arrow. We’ve implemented this interaction in the BP_Bow blueprint.
All interactions work with the PickupActorInterface blueprint and are coordinated through the blueprints BP_MotionController and MotionControllerPawn.
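The core math behind the gun's two-handed aiming can be sketched as follows (illustrative only; the sample's actual logic lives in the BP_PickupTwoHandedAim blueprint, and all names below are placeholders). The rear hand anchors the weapon, and the aim direction is the normalized vector from the rear grip to the hand near the barrel:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

// Aim direction from the rear (holding) hand toward the forward hand.
Vec3 TwoHandedAimDirection(const Vec3& rearGrip, const Vec3& frontGrip) {
    Vec3 d{frontGrip.x - rearGrip.x, frontGrip.y - rearGrip.y, frontGrip.z - rearGrip.z};
    float len = std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z);
    if (len < 1e-6f) return Vec3{1.0f, 0.0f, 0.0f}; // degenerate: fall back to forward
    return Vec3{d.x / len, d.y / len, d.z / len};
}

// The "stretch" described above can simply scale the weapon along its length
// by the hand separation relative to a rest length.
float StretchFactor(const Vec3& rearGrip, const Vec3& frontGrip, float restLength) {
    Vec3 d{frontGrip.x - rearGrip.x, frontGrip.y - rearGrip.y, frontGrip.z - rearGrip.z};
    return std::sqrt(d.x * d.x + d.y * d.y + d.z * d.z) / restLength;
}
```

The pole and bow follow the same pattern: one hand anchors position, and the vector between the hands drives orientation (or draw distance, in the bow's case).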
Rendering Techniques
This sample, which contains several maps demonstrating shadow, portal, and text rendering techniques, can be found in the Unreal-RenderingTechniques GitHub repo.
Below, you’ll find information for each of the provided maps. For more detailed information, see the sample’s readme.
When the sample is running, use the triggers on the Touch controllers to drag the sliders in the map. A blueprint attached to the light controls its rotation.
Color Grade Map
This map demonstrates how to use look-up tables (LUTs) to adjust the look of your scene.
The color grading map requires applying the patch in the ColorGradingLUTPatch subdirectory. Once you have applied the patch, the color-grading LUTs will be available for you to use. To learn more about LUTs for color grading, go to Using Lookup Tables (LUTs) for Color Grading.
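Conceptually, a color-grading LUT is just a small 3D table: each input color indexes into the table, and the stored entry is the graded output. The sketch below illustrates the idea with a nearest-neighbor lookup (this is not Unreal's implementation, which samples the LUT texture with filtering; all names here are illustrative):

```cpp
#include <array>

constexpr int N = 16; // LUT resolution per axis

struct Color { float r, g, b; };

using Lut3D = std::array<Color, N * N * N>;

// Build an identity LUT: every input color maps to itself. A graded LUT
// would instead store the artist's adjusted colors.
Lut3D MakeIdentityLut() {
    Lut3D lut{};
    for (int b = 0; b < N; ++b)
        for (int g = 0; g < N; ++g)
            for (int r = 0; r < N; ++r)
                lut[(b * N + g) * N + r] =
                    Color{r / float(N - 1), g / float(N - 1), b / float(N - 1)};
    return lut;
}

// Grade a color by looking up the nearest LUT entry.
Color ApplyLut(const Lut3D& lut, const Color& c) {
    auto idx = [](float v) {
        int i = int(v * (N - 1) + 0.5f);
        return i < 0 ? 0 : (i > N - 1 ? N - 1 : i);
    };
    return lut[(idx(c.b) * N + idx(c.g)) * N + idx(c.r)];
}
```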
Distance Field Baked Shadows Map
This map demonstrates distance field shadows. For distance field shadows to work, set the light type to stationary and enable Engine > Rendering > Mobile Shader Permutation Reduction > Support Pre-baked Distance Field Shadows in the project settings.
Portals Map
This map demonstrates two different methods of rendering portals in VR.
The first uses a static parallax-corrected cube map captured at the location of the portal you’re looking through. This cube map can be prebaked and is relatively inexpensive performance-wise, but it can exhibit artifacts such as warping as the viewer moves.
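The parallax-correction idea can be sketched as follows (illustrative math only, not the sample's shader code; all names are placeholders). Instead of sampling the cube map with the raw view direction, intersect the view ray with a box proxy approximating the space the cube map was captured in, then sample with the direction from the capture position to that intersection point:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Assumes the eye is inside the axis-aligned box proxy, so we take the
// nearest exit plane along the view ray.
Vec3 ParallaxCorrectedDir(const Vec3& eyePos, const Vec3& viewDir,
                          const Vec3& boxMin, const Vec3& boxMax,
                          const Vec3& capturePos) {
    auto exitT = [](float origin, float dir, float lo, float hi) {
        if (dir > 0.0f) return (hi - origin) / dir;
        if (dir < 0.0f) return (lo - origin) / dir;
        return 1e30f; // parallel to this slab
    };
    float t = std::min({exitT(eyePos.x, viewDir.x, boxMin.x, boxMax.x),
                        exitT(eyePos.y, viewDir.y, boxMin.y, boxMax.y),
                        exitT(eyePos.z, viewDir.z, boxMin.z, boxMax.z)});
    Vec3 hit{eyePos.x + viewDir.x * t,
             eyePos.y + viewDir.y * t,
             eyePos.z + viewDir.z * t};
    // Sample the cube map with this (unnormalized) direction.
    return Vec3{hit.x - capturePos.x, hit.y - capturePos.y, hit.z - capturePos.z};
}
```

The warping artifacts come from the box proxy only approximating the real geometry behind the portal.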
The second method uses stereo render targets to render, for each eye, the scene as it would appear through the portal. The effect is realistic but has a higher performance cost. This example splits the functionality between the PortalCaptureActorBP blueprint, the MF_ScreenAlignedUV_VR material function, and the PortalSampleMaterial material. The PortalCaptureActor class also contains some helper C++ methods.
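The screen-aligned UV idea behind a material function like MF_ScreenAlignedUV_VR can be sketched as follows (an assumption about its role, not the actual material code): the portal surface samples the pre-rendered portal texture at the pixel's own screen position, so each pixel shows exactly what the portal camera rendered there:

```cpp
struct UV { float u, v; };

// Clip-space position (x, y, w) comes from the vertex transform.
UV ScreenAlignedUV(float clipX, float clipY, float clipW) {
    // Perspective divide to normalized device coordinates in [-1, 1]...
    float ndcX = clipX / clipW;
    float ndcY = clipY / clipW;
    // ...then remap to texture coordinates in [0, 1]. Whether Y is flipped
    // depends on the graphics API's texture-origin convention.
    return UV{ndcX * 0.5f + 0.5f, 1.0f - (ndcY * 0.5f + 0.5f)};
}
```

In VR the same mapping is done per eye against that eye's render target, which is why the sample needs stereo render targets rather than a single capture.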
Text Rendering Map
This map demonstrates the effects of different texture filtering settings in combination with high-contrast textures like text. It also shows how you can use VR stereo layers to render high-quality text or textures.
Platform Solutions
This sample provides an example of initializing the Platform SDK, performing an entitlement check, and implementing different platform features. If you haven’t already done so, download and unzip the Platform SDK. You can find the sample within \Samples\UnrealSample\ inside the directory where you extracted the SDK.
For setup instructions, see the sample’s readme. For additional information, go to Sample Apps in the Platform Solutions section.
Unreal Engine 4.27 Samples
The following samples are for Unreal Engine 4.27 only.
Boundary/Guardian
This sample demonstrates how to:
Create a Blueprint that accesses the Meta Guardian System.
Position, scale, and rotate objects with respect to the Play Area.
Determine when the headset or controllers cross the Guardian boundary.
The sample only works with 6DOF (six degrees of freedom) headsets such as Oculus Rift and Meta Quest. You can find it under \Samples\Oculus\BoundarySample inside your Meta-integrated Unreal Engine installation path.
Cloud Saves and Downloadable Content (DLC)
This sample shows how to use:
The add-ons feature to provide DLC and in-app purchases.
The cloud storage feature to save data to the cloud.
You can find it under \Samples\Oculus\CloudSaveDLC inside your Meta-integrated Unreal Engine installation path. See the sample’s readme file for setup instructions and a detailed description of the sample.
To see how these features can be implemented, open DLCWidget.cpp and CloudSaveWidget.cpp inside \Samples\Oculus\CloudSaveDLC\Source\CloudSaveDLC\Private\.
Input
The Touch sample illustrates tracking, thumbstick control, and haptics control using the PlayHapticEffect() and PlayHapticSoundWave() functions.
You can find it under \Samples\Oculus\TouchSample inside your Meta-integrated Unreal Engine installation path.
When the sample is running, two spheres track with the Touch controllers. You can use the right controller thumbstick to control the position of the light gray box. Touch capacitance sensors detect whether the right thumb is in a touch or near-touch position, which controls the height of the box. Press and hold the left Touch grip button to play a haptic clip. Press and hold the left Touch X button to create a haptic effect by setting the haptic value directly.
The Touch sample level blueprint contains the Haptics and Thumbstick control blueprint. NewGameMode and VRCharacter initialize the scene, display it at the appropriate height, and so on.
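Conceptually, driving haptics from a sound wave (as PlayHapticSoundWave does) means converting audio samples into a per-frame vibration amplitude. A minimal sketch of that conversion, assuming a windowed-peak approach (this is not engine code, and the names are illustrative):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

// Slice mono PCM samples (in [-1, 1]) into frames and use each frame's peak
// absolute sample as the controller's vibration amplitude for that frame.
std::vector<float> HapticEnvelope(const std::vector<float>& samples, size_t frameSize) {
    std::vector<float> envelope;
    for (size_t start = 0; start < samples.size(); start += frameSize) {
        float peak = 0.0f;
        size_t end = std::min(samples.size(), start + frameSize);
        for (size_t i = start; i < end; ++i)
            peak = std::max(peak, std::fabs(samples[i]));
        envelope.push_back(peak); // amplitude in [0, 1] for this frame
    }
    return envelope;
}
```

Setting the haptic value directly, as the X button demo does, skips this conversion and feeds a chosen amplitude straight to the controller.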