Unreal Getting Started - Rift

The Oculus Source Distribution of the Unreal Engine (UE) contains a UE C++ sample project that demonstrates all the features available to Oculus Avatars in UE.

The example project demonstrates the following:

  • Using Avatar classes to create and destroy UE Avatar objects.
  • Changing hand poses to custom hand poses.
  • Recording local Avatar movement packets and replaying them on remote Avatars (including voice visualizations).
Note: Oculus Avatars for UE are for C++ projects. A Blueprints version is not available at this time.

Requirements

  • Unreal Editor 4.20
  • Note: Avatars are not yet supported on the newly released Unreal Engine 4.21
  • Microsoft Visual Studio 2015 or 2017 with C++

Architecture of a UE Avatar Project

Oculus Avatars for UE are implemented as a plugin. Avatars are embodied in UOvrAvatar ActorComponents that you can attach to any UE actor, which keeps your game-side code separate from our Avatar implementation.
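
For orientation, a minimal game-side actor header that owns an Avatar component might look like the following sketch. This is modeled on the sample's LocalAvatar class; the include path for UOvrAvatar and the exact property layout are assumptions, so check them against the sample sources:

#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "OvrAvatar.h" // assumed include path for the plugin's UOvrAvatar component
#include "LocalAvatar.generated.h"

UCLASS()
class ALocalAvatar : public AActor
{
	GENERATED_BODY()

public:
	ALocalAvatar();

private:
	// The plugin ActorComponent that embodies the Avatar for this actor.
	UPROPERTY()
	UOvrAvatar* AvatarComponent;
};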

Notable files in the sample project folder (UnrealEngine\Samples\Oculus\AvatarSamples) include the following:

  • Config/DefaultEngine.ini: Contains the App ID and adds Oculus Platform as a subsystem. Each game has its own DefaultEngine.ini file where settings are added and configured. When creating apps that use Avatars, you must edit this file to add your App ID (see the example after this list).
  • Source/AvatarSamples/LocalAvatar.cpp and RemoteAvatar.cpp: Contain the "game-side" classes that demonstrate how to attach Avatar components to actor classes.
  • AvatarSamples.uproject: Unreal project file for the sample.
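
For reference, a minimal DefaultEngine.ini configuration can look like the following sketch. The App ID value is a placeholder for your own ID from the Oculus Dashboard, and the [OnlineSubsystemOculus] keys shown are the ones used by the Online Subsystem Oculus plugin in this engine generation; confirm them against your engine's plugin documentation:

[OnlineSubsystem]
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
OculusAppId=1234567890123456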

Launching the Avatar Samples Unreal Project

  1. Download, build, and launch the Oculus Source Distribution of the Unreal Engine.
  2. From the Unreal Project Browser window, click Browse and open AvatarSamples.uproject at UnrealEngine\Samples\Oculus\AvatarSamples.
  3. Click Play > VR Preview.
  4. Put on your Rift and pick up the Touch controllers.

You should see the hands of your Avatar. This first-person view where you only see your hands is referred to as the local Avatar.

The code that spawns your first-person Avatar is in LocalAvatar.cpp:

ALocalAvatar::ALocalAvatar()
{
	RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));

	PrimaryActorTick.bCanEverTick = true;
	AutoPossessPlayer = EAutoReceiveInput::Player0;

	BaseEyeHeight = 170.f;

	AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
	// First-person visibility renders only the parts of the Avatar you
	// should see from inside it, such as the hands.
	AvatarComponent->SetVisibilityType(ovrAvatarVisibilityFlag_FirstPerson);
	// BaseEyeHeight is in Unreal units (cm); the Avatar SDK expects meters.
	AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);
}

Spawning and Destroying Remote Avatars

Squeeze the right Touch trigger to spawn Avatars in a circle around you. Squeeze the left Touch trigger to destroy them. These third-person Avatars with hands, heads, and base cones represent other people and are called remote Avatars.

The remote Avatars in this sample mimic your movements because they are hooked up to the Avatar packet recording and playback system. This system records your local Avatar's movements and microphone amplitude, then transmits this data to the remote Avatars and animates them accordingly. Speak or sing to see the mouth animations on the remote Avatars.

The packet recording is handled by ALocalAvatar::UpdatePacketRecording(float DeltaTime) in LocalAvatar.cpp.

Packet playback on remote Avatars is handled by ARemoteAvatar::Tick in RemoteAvatar.cpp. You might notice a small delay in the response between your local Avatar's movements and the corresponding movement in the remote Avatars. This is an artificial delay added to the sample to simulate network latency.
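
Under the hood, the plugin builds on the native Avatar SDK packet API. The following sketch shows the record/serialize/replay cycle using those C functions directly; the helper functions and variable names are illustrative, not part of the sample, which wraps this logic inside UOvrAvatar:

#include "CoreMinimal.h"
#include "OVR_Avatar.h" // native Avatar SDK header

// Sketch: finish the current recording window and serialize it for transport.
static TArray<uint8> SerializeAvatarPacket(ovrAvatar* LocalAvatar)
{
	ovrAvatarPacket* Packet = ovrAvatarPacket_EndRecording(LocalAvatar);
	const uint32_t Size = ovrAvatarPacket_GetSize(Packet);

	TArray<uint8> Buffer;
	Buffer.SetNumUninitialized(Size);
	ovrAvatarPacket_Write(Packet, Size, Buffer.GetData());
	ovrAvatarPacket_Free(Packet);

	ovrAvatarPacket_BeginRecording(LocalAvatar); // start the next window
	return Buffer;
}

// Sketch: replay a received buffer on a remote Avatar each Tick.
static void PlayAvatarPacket(ovrAvatar* RemoteAvatar, const TArray<uint8>& Buffer, float Elapsed)
{
	ovrAvatarPacket* Packet = ovrAvatarPacket_Read(Buffer.Num(), Buffer.GetData());
	const float Duration = ovrAvatarPacket_GetDurationSeconds(Packet);

	// Advance the pose along the recorded packet, clamped to its duration.
	ovrAvatar_UpdatePoseFromPacket(RemoteAvatar, Packet, FMath::Min(Elapsed, Duration));
	ovrAvatarPacket_Free(Packet);
}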

To toggle packet recording and playback, press A on your right Touch.

Custom Hand Poses

Press the thumbsticks to cycle through the following hand poses:

  • A built-in pose for gripping a sphere:

    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripSphere);
  • A built-in pose for gripping a cube:

    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripCube);
  • A custom hand gesture built from an array of joint transforms, gAvatarRightHandTrans:

    AvatarComponent->SetCustomGesture(ovrHand_Right, gAvatarRightHandTrans, HAND_JOINTS);
  • A built-in pose depicting Touch controllers:

    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_Default);
    AvatarComponent->SetControllerVisibility(ovrHand_Right, true);

The code snippets above are from LocalAvatar.cpp and set the poses for the right hand. For the left hand, substitute the corresponding left-hand functions and constants, as sketched below.
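
For example, assuming the left-hand API mirrors the right-hand calls shown above (SetLeftHandPose and gAvatarLeftHandTrans are assumed names, not verified against the sample):

// Assumed left-hand counterparts of the right-hand calls above:
AvatarComponent->SetLeftHandPose(ovrAvatarHandGesture_GripSphere);
AvatarComponent->SetCustomGesture(ovrHand_Left, gAvatarLeftHandTrans, HAND_JOINTS);
AvatarComponent->SetControllerVisibility(ovrHand_Left, true);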

Detaching and Moving Hands Independent of Tracking

Press Y on the left Touch or B on the right Touch to detach the Avatar hands from Touch tracking. You can then use the thumbsticks to drive the Avatar hand movements.

The following code in LocalAvatar.cpp detaches the hands:

AvatarHands[ovrHand_Right] = AvatarComponent->DetachHand(ovrHand_Right);

ALocalAvatar::DriveHand drives the hand movement after detaching.
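
A minimal sketch of what such a drive function can look like follows. The signature and the ThumbstickInput and HandMoveSpeed members are assumptions for illustration; the sketch assumes AvatarHands[Hand] stores the USceneComponent returned by DetachHand:

void ALocalAvatar::DriveHand(ovrHandType Hand, float DeltaTime)
{
	USceneComponent* HandComponent = AvatarHands[Hand];
	if (!HandComponent)
	{
		return;
	}

	// Map the thumbstick axes to movement in the hand's local space.
	// ThumbstickInput (FVector2D) and HandMoveSpeed are hypothetical
	// members updated from Touch input, used here for illustration.
	const FVector Offset(ThumbstickInput.Y, ThumbstickInput.X, 0.f);
	HandComponent->AddLocalOffset(Offset * HandMoveSpeed * DeltaTime);
}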

Adding Avatars to an Existing Project

Avatars are implemented as a plugin in Unreal. To add Avatar support to a new or existing Unreal project, select Edit > Plugins while in the Unreal Editor. From the Plugins window, search for Oculus and enable the Oculus Avatar and Online Subsystem Oculus plugins. The Online Subsystem Oculus plugin is necessary to query and retrieve Avatars from the Oculus platform.

You can also add the Oculus Avatar and Online Subsystem Oculus plugins by editing the Modules and Plugins sections of your .uproject file. Remember to add a comma (,) after the last item of any existing Modules or Plugins section before pasting in the additional entries. This example is from AvatarSamples.uproject:

"Modules": [
    {
	"Name": "AvatarSamples",
	"Type": "Runtime",
	"LoadingPhase": "Default",
	"AdditionalDependencies": [
	    "Engine",
	    "OnlineSubsystem",
	    "OnlineSubsystemUtils"
	]
    }
],
"Plugins": [
    {
	"Name": "OnlineSubsystemOculus",
	"Enabled": true
    },
    {
	"Name": "OculusAvatar",
	"Enabled": true
    }
]

Place your request to fetch the Avatar wherever you have set up online login functionality. This example is from LocalAvatar.cpp:

void ALocalAvatar::OnLoginComplete(int32 LocalUserNum, bool bWasSuccessful, const FUniqueNetId& UserId, const FString& Error)
{
	IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
	OculusIdentityInterface->ClearOnLoginCompleteDelegate_Handle(0, OnLoginCompleteDelegateHandle);

	bool UseCombinedMesh = true;
	// Hard-coded test user ID (see Testing Your Integration below); a
	// shipping app would use the ID of the logged-in user instead.
	const uint64 localUserID = 10150022857753417;

#if PLATFORM_ANDROID
	// Medium LOD on mobile to reduce mesh and texture cost.
	ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
#else
	// High LOD on PC.
	ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
#endif

	if (AvatarComponent)
	{
		AvatarComponent->RequestAvatar(localUserID, lod, UseCombinedMesh);
	}
}
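
If your project does not already perform an online login, a typical place to register the delegate and kick off the login is BeginPlay. This sketch uses the standard Online Subsystem identity interface; the exact placement in the sample may differ:

void ALocalAvatar::BeginPlay()
{
	Super::BeginPlay();

	IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
	if (OculusIdentityInterface.IsValid())
	{
		// Fire OnLoginComplete once the Oculus platform login finishes.
		OnLoginCompleteDelegateHandle = OculusIdentityInterface->AddOnLoginCompleteDelegate_Handle(
			0, FOnLoginCompleteDelegate::CreateUObject(this, &ALocalAvatar::OnLoginComplete));
		OculusIdentityInterface->AutoLogin(0);
	}
}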

This code also demonstrates how to enable combined meshes and set the level of detail (LOD) for Avatars.

When requesting an Avatar, set UseCombinedMesh to true to enable combined meshes, which reduce draw call overhead for the Avatar's body.

In this example, the LOD for the requested Avatar is set to ovrAvatarAssetLevelOfDetail_Five (high LOD) for Oculus Rift and ovrAvatarAssetLevelOfDetail_Three (medium LOD) for mobile. This improves performance on Oculus Go and Samsung Gear VR by using lower-resolution meshes and textures. There is also a low LOD, ovrAvatarAssetLevelOfDetail_One, which conserves even more resources and is ideal for distant Avatars or crowds.

Testing Your Integration

Once you’ve completed your integration, you can test it by retrieving Avatars in-engine. Use the following test user IDs:

  • 10150022857785745
  • 10150022857770130
  • 10150022857753417
  • 10150022857731826