After completing this section, the developer should:
Understand the different tracking implementations available for body, face and eye tracking.
Understand the benefits and downsides of each tracking implementation.
Understand which tracking implementation is best suited for their project.
Our tracking services offer three implementation options, each with unique advantages and disadvantages. The choice depends on your project’s specific requirements. These options differ in how tracking data is fetched, how it is applied to the character rig, and the level of control you have over the data. The first option is recommended because it covers most use cases.
Animation Node (Recommended)
Unreal has a powerful Animation Blueprint system that allows you to blend animations and manipulate the bones of a character rig. This method provides retargeting features that allow the developer to support custom avatar skeletons. It is intended for developers who want to mix tracked motion with keyframe animation or need to adapt to different skeletons.
Actor component
Actor Components are supported for developers who want a lightweight interface that can be accessed directly from the Actor. These are small Blueprint components that you attach to other Blueprints to break your code into manageable groups and add functionality. For more detail on this method, refer to Actor Component Implementation. This implementation is recommended if you are not using our retargeting features and are comfortable executing in the context of the main game thread.
OculusXREyeTrackerModule
In the case of eye tracking, you can also use Unreal’s native IEyeTrackerModule interface. This lets you work with Unreal Engine’s built-in types and function libraries. See the Unreal Engine Eye Tracker documentation.
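If you prefer to poll gaze data in C++ rather than through Blueprint, a minimal sketch using Unreal’s native eye tracker function library might look like the following. AMyGazeActor is a hypothetical actor used only for illustration; the only requirement is adding the EyeTracker module to your Build.cs dependencies.

// Minimal sketch: reading gaze data through Unreal's native eye tracker interface.
// The OculusXREyeTracker module supplies the IEyeTrackerModule implementation, so
// this code stays engine-native. AMyGazeActor is a hypothetical actor for illustration.
#include "EyeTrackerFunctionLibrary.h"
#include "EyeTrackerTypes.h"
#include "DrawDebugHelpers.h"

void AMyGazeActor::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    // Bail out if no eye tracker is available (unsupported device or permission denied).
    if (!UEyeTrackerFunctionLibrary::IsEyeTrackerConnected())
    {
        return;
    }

    FEyeTrackerGazeData GazeData;
    if (UEyeTrackerFunctionLibrary::GetGazeData(GazeData))
    {
        // Gaze origin and direction are reported in tracking space by the runtime.
        const FVector End = GazeData.GazeOrigin + GazeData.GazeDirection * 200.0f;
        DrawDebugLine(GetWorld(), GazeData.GazeOrigin, End, FColor::Green);
    }
}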
Tracking using the Animation Node implementation
Prerequisite: The OculusXRMovement plugin distributed with the Movement Sample must be installed.
After completing this section, the developer should be able to apply body tracking, eye tracking, or face tracking to a humanoid character model using the Animation Node implementation.
At a high level the process consists of the following steps:
Import the character. This step illustrates the concept using an FBX file, but Unreal provides many mechanisms for importing character assets, so feel free to import in any supported manner.
Create an Animation Blueprint. This blueprint will map from the body tracking, eye tracking, or face tracking input to the Pawn.
Create a Pawn. In Unreal, a Pawn is used to control characters with locomotion and provides an anchor for the skinned mesh.
Test and verify. This ensures you have a working setup.
This method also supports Unreal Metahuman. After a quick scan of this section to become familiar with the Animation Node Implementation, refer to Animation Node with Metahuman as the import of the character and creation of the Pawn differ from the default Animation Node procedures described in this section.
Step 1: Import the character
If you already have a character model in the project, you can skip this step. If not, follow these steps to import your character model:
Navigate in your content to the directory in which you want to store your character.
Import the FBX file containing your character model.
This should create several assets including a Skeletal Mesh asset and a Skeleton Asset. For ease of reference later, these are named as “SK_MySkeletalMesh” and “SKEL_MySkeleton” in this example.
Step 2: Create an animation blueprint
Create a new Animation Blueprint from the skeleton you just imported (for example, “SKEL_MySkeleton”) and name it something that makes sense to you (for example, “ABP_MyPawn”).
Open the Animation Blueprint you just created.
Step 2a: Animation blueprint for body tracking
This section describes the changes for body tracking. If you don’t need body tracking, skip to Step 2b: Animation Blueprint for Face Tracking.
Add the OculusXR Body Tracking node and connect it to the Output Pose.
Create an Input Pose node and connect it to the OculusXR Body Tracking node as shown above. You can leave the Debug Pose Mode and Debug Draw Mode settings set to None; these settings are discussed in the Troubleshooting section. Leave the final field, Root Motion Behavior, set to Combine Motion Into Root. The available modes are explained below (a conceptual sketch of the default mode follows the list):
Combine Motion into Root (default setting in Unreal, for best compatibility with locomotion systems): Extracts the yaw rotation from the hip joint and applies it to the root joint so the root rotates with the character. When the user jumps, the upward translation when the feet leave the ground is applied to the root instead of the hip.
Root Translation with Full Hip Rotation (the raw behavior from Movement SDK’s OpenXR output): All rotation and upward translation are applied to the hip joint. Flat translation from the tracking origin is applied to the root joint. No rotation is applied to the root joint.
Zero Root Translation with Zero Hip Yaw: Used for locking the character in place, always facing a developer-specified direction. Similar to Combine Motion Into Root, all yaw and upward translation is extracted from the hip, but the root is zeroed out so the user stays in a fixed position. This could be used for a character selection hallway or fixed communications (like Avatar calling).
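The following is a conceptual C++ illustration of the default Combine Motion Into Root behavior, based only on the description above. It is not the plugin’s actual implementation, and the function name is purely illustrative.

// Conceptual illustration only (not the plugin's code): splitting the tracked
// hip transform between the root and hip joints for Combine Motion Into Root.
#include "CoreMinimal.h"

void CombineMotionIntoRoot(const FTransform& TrackedHip, FTransform& OutRoot, FTransform& OutHip)
{
    // The root takes the yaw rotation and the translation, including the upward
    // translation when the feet leave the ground (for example, a jump).
    OutRoot = FTransform::Identity;
    const FRotator HipRotator = TrackedHip.GetRotation().Rotator();
    OutRoot.SetRotation(FRotator(0.f, HipRotator.Yaw, 0.f).Quaternion());
    OutRoot.SetTranslation(TrackedHip.GetTranslation());

    // The hip keeps the remaining pitch/roll, expressed relative to the root,
    // so the full tracked pose is preserved once the two are combined.
    OutHip = TrackedHip.GetRelativeTransform(OutRoot);
}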
Select the OculusXR Body Tracking node and look at the Details panel.
Set Retargeting Mode to Rotation & Positions. This defines how the incoming body tracking data is combined with the existing skeleton of your character. For an in-depth introduction on how each of these work, see the following guide: Body Tracking Modes.
Set Forward Mesh to be the direction your mesh is facing in the preview. In our case this is +X.
Bone Mapping: In the Animation Blueprint, select the OculusXR Body Tracking node and perform the following steps:
In the details panel expand the Bone Remapping property. The default values match the Unreal 5 Mannequin, so if your character is based on that rig you can skip ahead. This map defines how each bone in the Oculus Body Tracking Skeleton (left column) matches to each bone in your character’s skeleton (right column).
Go through each bone and update the right-hand column with the names of the matching bones in your character rig. Look at the bone joints reference to understand the location of each bone in the Oculus Body Tracking Skeleton. If your skeleton has more bones than the Oculus Body Tracking Skeleton, you need to decide which bones to skip. In these cases, read through the section Bone Mapping for a Custom Rig to learn how to map the bones correctly. Follow these guidelines:
It is very helpful to open your skeleton in a separate window and turn on Bones in the visualizer while setting this up.
If your skeleton has fewer bones than the Oculus Body Tracking Skeleton, you need to decide which bones should be mapped to None.
Step 2b: Animation Blueprint for face tracking
This section describes the changes for face tracking. If you don’t want to have face tracking, skip to Step 2c: Animation Blueprint for eye tracking.
Add the OculusXR Face Tracking node and connect it to the Output Pose.
Morph Target Mapping. In the Animation Blueprint, select the OculusXR Face Tracking node and follow these steps:
In the details panel, expand the Expression Names property. This map defines how each curve in the Oculus Face Tracking Blendshapes (left column) matches to each curve in your character’s Morph Targets (right column).
Go through each curve and update the right-hand column with the names of the matching curves in your character rig. Look at the reference morph targets to understand what each morph target does on the face. It is very helpful to open your skeleton rig in a separate window and switch to the Morph Target Preview while performing this work.
Expression Modifiers. Expression modifiers allow you to amplify or clamp specific morph target values for facial expressions. To apply them, select the face tracking node in the animation graph and add entries to the ExpressionModifiers property on the node. Each entry maps a morph target to a struct containing min, max, and multiplier. When the node receives data from the runtime, modifiers are applied to the mapped expressions, multiplying the value and clamping it between min and max. In this example, a modifier disables the right eyelid expression by multiplying it by zero; setting the max value to zero would have the same effect. By default, a modifier has a min of zero, a max of one, and a multiplier of one, resulting in no change to the expression.
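As an illustration of the math only (the names below are illustrative, not the plugin’s internal types), each modified expression value is computed roughly as follows:

#include "Math/UnrealMathUtility.h"

// Conceptual illustration: how a single expression modifier affects a morph target value.
float ApplyExpressionModifier(float ExpressionValue, float Min, float Max, float Multiplier)
{
    // Multiply first, then clamp the result into [Min, Max].
    return FMath::Clamp(ExpressionValue * Multiplier, Min, Max);
}

// Examples:
//   Disable an expression:        Multiplier = 0 (or Max = 0)
//   Amplify a subtle expression:  Multiplier = 1.5, Min = 0, Max = 1
//   Default (no change):          Min = 0, Max = 1, Multiplier = 1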
Step 2c: Animation Blueprint for eye tracking
This section describes the changes for eye tracking. If you don’t want to have eye tracking, skip this section.
Add the OculusXR Eye Tracking node and connect it to the Output Pose.
In the Animation Blueprint, select the OculusXR Eye Tracking node and enter the bone names of the left and right eye. Eye tracking works by changing the rotation of the bones according to the incoming data.
Note: All the movement tracking nodes are designed to be used together and can be placed in any order.
Step 3: Create a pawn
This section describes how to start body, face, or eye tracking from the Pawn. If you already have a pawn, skip the creation, but make sure that you implement the required changes to the Animation Blueprint for the pawn outlined below.
Create a new blueprint that inherits from Pawn and give it a name of your choice (for example, “MyPawn”).
Open MyPawn and add a Skeletal Mesh component.
Select the Skeletal Mesh component and configure the following settings:
Set the Skeletal Mesh Asset to the skeletal mesh you imported in the step above.
Set the Anim Class to the Animation Blueprint you created in the previous step: “ABP_MyPawn”.
Rotate the Skeletal Mesh to face forward along the actor forward axis. Example: Unreal Engine 5’s Mannequin needs a -90 degree rotation around the Z axis.
Open the Event Graph of your Pawn (for example, “MyPawn”).
Add a 0.2 second Delay node and a Set Tracking Origin node with Floor Level selected after the Begin Play node.
Enable tracking as follows (a C++ sketch of this setup appears after the list):
To enable body tracking, add a Start Body Tracking by Joint Set node as shown below:
To enable face tracking, add the Start Face Tracking node as shown below:
To enable eye tracking, add the Start Eye Tracking node as shown below:
Note: All the start tracking nodes are designed to be used together.
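For reference, a rough C++ equivalent of the Blueprint setup above might look like the sketch below. AMyPawn and its StartTracking helper are hypothetical, and the UOculusXRMovementFunctionLibrary calls are assumptions inferred from the Blueprint node names in this section; verify the exact class, header, and enum names against your installed plugin version.

// Hedged sketch of the Blueprint setup above. AMyPawn is a hypothetical pawn class;
// the OculusXRMovement function names are assumptions mirroring the Blueprint nodes.
#include "GameFramework/Pawn.h"
#include "HeadMountedDisplayFunctionLibrary.h"
#include "OculusXRMovementFunctionLibrary.h" // assumption: header name from the OculusXRMovement plugin
#include "TimerManager.h"

void AMyPawn::BeginPlay()
{
    Super::BeginPlay();

    // Equivalent of the 0.2 second Delay node before configuring tracking.
    FTimerHandle StartTrackingHandle;
    GetWorldTimerManager().SetTimer(StartTrackingHandle, this, &AMyPawn::StartTracking, 0.2f, false);
}

void AMyPawn::StartTracking()
{
    // "Set Tracking Origin" with Floor Level selected.
    // Note: the Floor value may be named differently in newer engine versions.
    UHeadMountedDisplayFunctionLibrary::SetTrackingOrigin(EHMDTrackingOrigin::Floor);

    // Start only the tracking services your experience needs.
    UOculusXRMovementFunctionLibrary::StartBodyTrackingByJointSet(EOculusXRBodyJointSet::UpperBody); // assumption: enum value
    UOculusXRMovementFunctionLibrary::StartFaceTracking();
    UOculusXRMovementFunctionLibrary::StartEyeTracking();
}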
Step 4: Test and verify
If you want the pawn to be created in 1P, navigate to your GameMode. If using the VR Template, this is called VRGameMode. Select Class Defaults and change the Default Pawn Class to your newly created pawn (for example, “MyPawn”). This ensures your pawn will be created and possessed when the game starts.
If testing third-person embodiment, add the pawn to the scene somewhere you can see it. (Scaling the pawn by -1 on the X axis will cause it to mirror your movements.)
To confirm operation, you can:
Build, deploy and run on your headset to test that the tracking works on device.
Play in editor by selecting VR Preview to test that tracking works over Link.
Animation Node with Metahuman
After completing this section, the developer should:
Understand how to download and add an Unreal Metahuman to an existing project.
Be able to apply body tracking and face tracking to the imported Metahuman.
Prerequisites
The Unreal Project must be set up to support body, face, or eye Tracking with the Meta XR Plugin installed. See Unreal Project Setup for instructions.
The reader should be familiar with setting up tracking using the Animation Node procedure described in Tracking Using Animation Node.
Step 1: Adding a Metahuman to the project
Unreal’s Quixel Bridge is used to configure and download a Metahuman. Follow these instructions to download your Metahuman asset:
Quixel Bridge.
Once downloaded, export the Metahuman asset through Quixel Bridge into the Unreal Engine using the following link: Quixel Bridge Export.
Use the Content Drawer to verify that you have a blueprint for your Metahuman. The following image shows an example Metahuman named Malika. Note that the realistic photo in the thumbnail may not show up until you open the blueprint for the first time.
Step 2: Create the Animation Blueprint
Create an Animation Blueprint with the Metahuman skeleton. The creation of the Animation Blueprint should provide a list of skeletons in your project from which you can choose; a Metahuman usually uses the “metahuman_base_skel” skeleton. Give the created Animation Blueprint a name of your choice. For comparison, you can open “ABP_Malika” in our example and click on the skeleton icon at the top of the screen to identify the skeleton (“metahuman_base_skel”).
Step 2a: Animation Blueprint for body tracking
This section will provide a brief overview of the steps required for setting up body tracking. They are described in more detail in the Tracking Using Animation Node section of this document.
Add the OculusXR Body Tracking node to your Animation Blueprint.
Connect the OculusXR Body Tracking node to the Output Pose.
Select the OculusXR Body Tracking node and confirm each of the settings for body tracking. These are explained in more detail in Tracking Using Animation Node.
The default values for bone mapping match both Metahuman and Unreal 5 Mannequin, so you shouldn’t need to change these.
Confirm Retargeting Mode is set to your desired mode. For more information on the retargeting modes, see Retargeting Modes.
Step 2b: Animation Blueprint for face tracking
The default values for the morph targets of the OculusXR Face Tracking node DO NOT match Metahuman, so you need to update them. The sample project provides ABP_Metahuman_Template, which maps the detected face motions to the Metahuman morph targets. This file is located in the Content/Pawns/Metahuman folder.
Copy the ABP_Metahuman_Template to your project.
Open the ABP_Metahuman_Template. Select the OculusXR Face Tracking node and copy it into the Animation Blueprint you created in step 2 above (for example, “ABP_Malika”).
Select the OculusXR Face Tracking node in your target project and expand the Details panel. If you expand the Expression Modifiers, you should see a mapping like the one shown in the figure below.
Add an Input Pose node and connect it to the OculusXR Face Tracking node as shown in the figure below.
Save the Blueprint.
Step 2c: Animation Blueprint for eye tracking
The eyes of a Metahuman don’t have linked bones and are controlled by face-tracking blendshapes.
Step 3: Create a pawn
Create a Pawn and give it a name of your choice.
Open the Pawn and drag the Blueprint class that was created when you imported the Metahuman onto the DefaultSceneRoot of the Pawn. For our example, it’s “BP_Malika”, located in the Content/MetaHumans/Malika folder (see the screenshot from Step 1 above).
Check that the added character is facing to the front of the pawn.
Open the Pawn and use the Viewport to note the orientation of the Pawn (shown by the axis arrow in the lower left of the frame). Since Unreal Engine expects the forward direction to be along the X axis, the character should be facing to the right (along the X axis).
For the example below, it means we need to apply a -90 degree rotation so that the character now faces along the X-axis.
Select the Blueprint class from step 2 and open the Details panel. Navigate to Child Actor Component -> Child Actor Template -> Live Retarget, as shown in the image below.
Ensure Retarget Orig Body Anim Mode is set to Use Animation Blueprint.
Set Retarget Orig Animation Class to the Animation Blueprint you created in Step 2 (for example, “ABP_Malika”).
Open the event graph of the pawn.
Add a 0.2 second Delay node and a Set Tracking Origin node set to Floor Level after the Begin Play node.
Enable tracking as follows:
To enable body tracking, add a Start Body Tracking by Joint Set node as shown below:
To enable face tracking, add the Start Face Tracking node as shown below:
Compile and save.
Step 4: Test and verify
Confirm in Project Settings that OpenXR plugins other than the Meta XR plugin are not enabled. Some templates, such as the VR Template, have rules indicating a dependency on other plugins; if you have accidentally clicked OK to enable them, the Meta XR plugin will not work properly.
If you want the pawn to be created in 1P, navigate to your GameMode. If using the VR Template this is called VRGameMode. Select Class Defaults and change the Default Pawn Class to your newly created pawn (for example “MyPawn”). This ensures your pawn will be created and possessed when the game starts.
If testing third-person embodiment, add the pawn to the scene somewhere you can see it. (Scaling the pawn by -1 on the X axis will cause it to mirror your movements.)
In order to confirm operation, you can:
Build, deploy and run on your headset to test that the tracking works on device.
Play in editor by selecting VR Preview to test that tracking works over Link.
Body tracking Calibration API
After completing this section, the developer should:
Understand when the Calibration API might be useful.
Apply the Calibration API to override the auto-detected height with an explicit height determined by the app-specific calibration.
Auto-calibration is a process by which the system tries to determine the height of the person wearing the headset. The height is necessary to ensure the detection of standing, squatting, or sitting. The auto-calibration routines run within the first ten seconds of the service being created, so the initial state when requesting the service is important. Ideally, the application should ensure that the user is standing when the service is launched. If the person is in a seated position, they can instead fully extend an arm and draw a circle of around 0.3 meters (1 foot) in diameter; in this case, height can be estimated from wingspan.
However, there are some cases in which auto-calibration might not work well (for example, the person remains seated and doesn’t stretch out their arm). There are other situations in which the app has already gone through an initialization process to determine the person’s height and would prefer to use it. The Calibration API provides SuggestBodyTrackingCalibrationOverride(float height), where height is specified in meters, to override auto-calibration. To reset the body tracking calibration to its default state and initiate automatic recalibration again, call ResetBodyTrackingCalibration().
If using auto-calibration:
Prompt the user to stand or, if sitting, to extend an arm at a 45-degree angle to the body and draw a circle as described earlier.
Call ResetBodyTrackingCalibration().
Wait 10 seconds, then indicate to the user that calibration is complete.
If using manual calibration in your app, use SuggestBodyTrackingCalibrationOverride(float height) to specify the height of the player. This is the user’s full height (not the center-eye position) and should include a 1-2 cm offset to allow for shoes. A sketch of both calibration paths follows.
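For reference, a hedged C++ sketch of both calibration paths is shown below. The two function names come from this section; the owning class (UOculusXRMovementFunctionLibrary) and header are assumptions, so verify them against your installed plugin version.

// Hedged sketch: overriding or resetting body tracking calibration from C++.
// The owning class and header are assumptions; the function names are from this section.
#include "OculusXRMovementFunctionLibrary.h"

void ApplyAppCalibration(bool bUseAutoCalibration, float UserHeightMeters)
{
    if (bUseAutoCalibration)
    {
        // Prompt the user to stand (or draw a circle with an extended arm) first,
        // then restart auto-calibration and wait roughly 10 seconds.
        UOculusXRMovementFunctionLibrary::ResetBodyTrackingCalibration();
    }
    else
    {
        // Override auto-calibration with the app-measured height, in meters.
        // Use the user's full height (not the center-eye position), plus ~1-2 cm for shoes.
        UOculusXRMovementFunctionLibrary::SuggestBodyTrackingCalibrationOverride(UserHeightMeters);
    }
}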
Calibration Blueprint
To suggest the user height from Blueprint, use the Suggest Body Tracking Calibration Override function. It requires a height value in meters.
To reset the body tracking calibration to its default state and initiate automatic recalibration, use the Reset Body Tracking Calibration function.