Unreal Getting Started

The Oculus Source Distribution of the Unreal Engine (UE) contains a UE C++ sample project that implements and illustrates all the features available to Oculus Avatars in UE, including the new expressive features.

Oculus Avatars for UE are supported on Rift and Android devices.

The example project demonstrates the following:

  • Using Avatar classes to create and destroy UE Avatar objects.
  • Hooking up the lip-sync component to drive expressive features.
  • Tagging objects in the scene as gaze targets for the expressive Avatar’s eyes.
  • Changing hand poses to custom hand poses.
  • Recording local Avatar movement packets and replaying them on remote Avatars (including voice visualizations).

Note: Oculus Avatars for UE are for C++ projects. A Blueprints version is not available at this time.

Requirements

  • Unreal Editor 4.21 or later from the Oculus GitHub release
  • Microsoft Visual Studio 2015 or 2017 with C++

Architecture of a UE Avatar Project

Oculus Avatars for UE are implemented as a plugin. Avatars are embodied in UOvrAvatar ActorComponents that you can attach to any UE actor you choose. This keeps your game-side code separate from the Avatar implementation.
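
For orientation, here is a minimal sketch of a game-side actor that embodies an Avatar. Everything except UOvrAvatar is hypothetical, and the plugin header path may differ in your engine branch:

#include "GameFramework/Actor.h"
#include "OvrAvatar.h"
#include "MyAvatarActor.generated.h"

UCLASS()
class AMyAvatarActor : public AActor
{
    GENERATED_BODY()

public:
    AMyAvatarActor()
    {
        RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("AvatarRoot"));

        // The plugin-provided ActorComponent that loads, renders, and drives the Avatar.
        AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("Avatar"));
    }

private:
    UPROPERTY()
    UOvrAvatar* AvatarComponent;
};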

Notable files in the sample project folder (UnrealEngine\Samples\Oculus\AvatarSamples) include the following:

  • Config/DefaultEngine.ini: Contains the App ID and registers Oculus Platform as an online subsystem. Each game has its own DefaultEngine.ini file where these settings are added and configured. When creating an app that uses Avatars, you must edit this file to add your App ID (a sketch of these settings appears after this list).
  • Config/Android/AndroidEngine.ini: Contains Android-specific overrides that configure the online subsystem for Oculus.
  • Source/AvatarSamples/LocalAvatar.cpp and RemoteAvatar.cpp: Contain the “game-side” classes that demonstrate how to attach Avatar components to actor classes.
  • AvatarSamples.uproject: Unreal project file for the sample.
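
As a rough sketch, the DefaultEngine.ini entries for the App ID and the Oculus online subsystem look like the following. Key names vary between engine and plugin versions, so treat this as an outline and substitute your own App ID:

[OnlineSubsystem]
DefaultPlatformService=Oculus

[OnlineSubsystemOculus]
bEnabled=true
OculusAppId=<your App ID>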

Launching the Avatar Samples Unreal Project

  1. Download, build, and launch the Oculus Source Distribution of the Unreal Engine.
  2. From the Unreal Project Browser window, click Browse and open AvatarSamples.uproject at UnrealEngine\Samples\Oculus\AvatarSamples.
  3. Click Play > VR Preview.
  4. Put on your Rift and pick up the Touch controllers.

You should see your Avatar's hands. The Avatar you embody in this first-person view, where you see only your hands, is referred to as the local Avatar.

The code that configures your Avatar can be found in LocalAvatar.cpp:

// Forward the latest viseme weights from the active lip-sync component to the Avatar.
void ALocalAvatar::LipSyncVismesReady()
{
    if (UseCannedLipSyncPlayback)
    {
        AvatarComponent->UpdateVisemeValues(PlayBackLipSyncComponent->GetVisemes());
    }
    else
    {
        AvatarComponent->UpdateVisemeValues(LipSyncComponent->GetVisemes());
    }
}

void ALocalAvatar::PreInitializeComponents()
{
    Super::PreInitializeComponents();

    if (UseCannedLipSyncPlayback)
    {
        // Canned playback: load a pre-recorded viseme sequence and its matching audio clip.
        FString playbackAssetPath = TEXT("/Game/Audio/vox_lp_01_LipSyncSequence");
        auto sequence = LoadObject<UOVRLipSyncFrameSequence>(nullptr, *playbackAssetPath, nullptr, LOAD_None, nullptr);
        PlayBackLipSyncComponent->Sequence = sequence;

        FString AudioClip = TEXT("/Game/Audio/vox_lp_01");
        auto SoundWave = LoadObject<USoundWave>(nullptr, *AudioClip, nullptr, LOAD_None, nullptr);

        if (SoundWave)
        {
            SoundWave->bLooping = 1;
            AudioComponent->Sound = SoundWave;
        }
    }
#if PLATFORM_WINDOWS
    else
    {
        // Live microphone capture: disable the engine's silence detection so
        // quiet speech still reaches the lip-sync component.
        auto SilenceDetectionThresholdCVar = IConsoleManager::Get().FindConsoleVariable(TEXT("voice.SilenceDetectionThreshold"));
        SilenceDetectionThresholdCVar->Set(0.f);
    }
#endif

    // TODO SW: Fetch Player Height from Oculus Platform?
    BaseEyeHeight = 170.f;

    AvatarComponent->SetVisibilityType(
        AvatarVisibilityType == AvatarVisibility::FirstPerson
        ? ovrAvatarVisibilityFlag_FirstPerson
        : ovrAvatarVisibilityFlag_ThirdPerson);

    // BaseEyeHeight is in centimeters (Unreal units); the Avatar SDK expects meters.
    AvatarComponent->SetPlayerHeightOffset(BaseEyeHeight / 100.f);

    AvatarComponent->SetExpressiveCapability(EnableExpressive);
    AvatarComponent->SetBodyCapability(EnableBody);
    AvatarComponent->SetHandsCapability(EnableHands);
    AvatarComponent->SetBaseCapability(EnableBase);

    AvatarComponent->SetBodyMaterial(GetOvrAvatarMaterialFromType(BodyMaterial));
    AvatarComponent->SetHandMaterial(GetOvrAvatarMaterialFromType(HandsMaterial));
}

ALocalAvatar::ALocalAvatar()
{
    RootComponent = CreateDefaultSubobject<USceneComponent>(TEXT("LocalAvatarRoot"));

    PrimaryActorTick.bCanEverTick = true;

    AvatarComponent = CreateDefaultSubobject<UOvrAvatar>(TEXT("LocalAvatar"));
    PlayBackLipSyncComponent = CreateDefaultSubobject<UOVRLipSyncPlaybackActorComponent>(TEXT("CannedLipSync"));
    AudioComponent = CreateDefaultSubobject<UAudioComponent>(TEXT("LocalAvatarAudio"));
    LipSyncComponent = CreateDefaultSubobject<UOVRLipSyncActorComponent>(TEXT("LocalLipSync"));
}

void ALocalAvatar::EndPlay(const EEndPlayReason::Type EndPlayReason)
{
    // Unbind the viseme callbacks before the actor is torn down.
    LipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);
    PlayBackLipSyncComponent->OnVisemesReady.RemoveDynamic(this, &ALocalAvatar::LipSyncVismesReady);

    if (!UseCannedLipSyncPlayback)
    {
        LipSyncComponent->Stop();
    }
}

void ALocalAvatar::BeginPlay()
{
    Super::BeginPlay();

    // Convert the editor-configured Oculus user ID string to the numeric ID the SDK expects.
    uint64 UserID = FCString::Strtoui64(*OculusUserId, NULL, 10);

#if PLATFORM_ANDROID
    // On Android, request the Avatar immediately at a medium LOD.
    ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
    if (AvatarComponent)
    {
        AvatarComponent->RequestAvatar(UserID, lod, UseCombinedMesh);
    }
#else
    ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;

    // On desktop, log in through the Oculus online subsystem first;
    // the Avatar itself is requested in OnLoginComplete.
    IOnlineIdentityPtr IdentityInterface = Online::GetIdentityInterface();
    if (IdentityInterface.IsValid())
    {
        OnLoginCompleteDelegateHandle = IdentityInterface->AddOnLoginCompleteDelegate_Handle(0, FOnLoginCompleteDelegate::CreateUObject(this, &ALocalAvatar::OnLoginComplete));
        IdentityInterface->AutoLogin(0);
    }
#endif

    if (UseCannedLipSyncPlayback)
    {
        PlayBackLipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
    }
    else
    {
        LipSyncComponent->OnVisemesReady.AddDynamic(this, &ALocalAvatar::LipSyncVismesReady);
        LipSyncComponent->Start();
    }
}

Custom Hand Poses

Press the thumbsticks to cycle through the following hand poses:

  • A built-in pose for gripping a sphere:
    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripSphere);
  • A built-in pose for gripping a cube:
    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_GripCube);
  • A custom hand gesture built from an array of joint transforms, gAvatarRightHandTrans:
    AvatarComponent->SetCustomGesture(ovrHand_Right, gAvatarRightHandTrans, HAND_JOINTS);
  • A built-in pose depicting Touch controllers:
    AvatarComponent->SetRightHandPose(ovrAvatarHandGesture_Default);
    AvatarComponent->SetControllerVisibility(ovrHand_Right, true);

The code snippets above are from LocalAvatar.cpp and set the poses for the right hand. For the left hand, substitute the corresponding left-hand functions and constants, as sketched below.
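
For example, the left-hand equivalents might look like the following sketch, assuming the left-hand counterparts of the functions shown above and a left-hand joint-transform array analogous to gAvatarRightHandTrans:

AvatarComponent->SetLeftHandPose(ovrAvatarHandGesture_GripSphere);
AvatarComponent->SetCustomGesture(ovrHand_Left, gAvatarLeftHandTrans, HAND_JOINTS);
AvatarComponent->SetControllerVisibility(ovrHand_Left, true);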

Adding Avatars to an Existing Project

Avatars are implemented as a plugin in Unreal. To add Avatar support to a new or existing Unreal project, select Edit > Plugins while in the Unreal Editor. From the Plugins window, search for Oculus and enable the Oculus Avatar and Online Subsystem Oculus plugins. The Online Subsystem Oculus plugin is necessary to query and retrieve Avatars from the Oculus platform.

You can also add the Oculus Avatar and Online Subsystem Oculus plugins by editing the Modules and Plugins sections of your .uproject file. Remember to add a comma (,) to the last item in any existing Modules or Plugins sections before pasting the additional lines. This example is from AvatarSamples.uproject:

"Modules": [
    {
	"Name": "AvatarSamples",
	"Type": "Runtime",
	"LoadingPhase": "Default",
	"AdditionalDependencies": [
	    "Engine",
	    "OnlineSubsystem",
	    "OnlineSubsystemUtils"
	]
    }
],
"Plugins": [
    {
	"Name": "OnlineSubsystemOculus",
	"Enabled": true
    },
    {
	"Name": "OculusAvatar",
	"Enabled": true
    }
]

Place your request to fetch the Avatar wherever you have set up online login functionality. This example is from LocalAvatar.cpp:

void ALocalAvatar::OnLoginComplete(int32 LocalUserNum, bool bWasSuccessful, const FUniqueNetId& UserId, const FString& Error)
{
    IOnlineIdentityPtr OculusIdentityInterface = Online::GetIdentityInterface();
    OculusIdentityInterface->ClearOnLoginCompleteDelegate_Handle(0, OnLoginCompleteDelegateHandle);

    bool UseCombinedMesh = true;
    // The sample hard-codes one of the Oculus test user IDs; a real app would
    // derive the ID from the logged-in user.
    const uint64 localUserID = 10150022857753417;

#if PLATFORM_ANDROID
    ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
#else
    ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
#endif

    if (AvatarComponent)
    {
        AvatarComponent->RequestAvatar(localUserID, lod, UseCombinedMesh);
    }
}

This code also demonstrates how to enable combined meshes and set the level of detail (LOD) for Avatars.

When requesting an Avatar, set UseCombinedMesh to true to enable combined meshes, which reduce draw call overhead for the Avatar’s body.

In this example, the LOD for the requested Avatar is set to ovrAvatarAssetLevelOfDetail_Five (high LOD) for Oculus Rift and ovrAvatarAssetLevelOfDetail_Three (medium LOD) for Mobile. This improves performance on Mobile by using lower resolution meshes and textures. There is also a low LOD, ovrAvatarAssetLevelOfDetail_One, which conserves even more resources. Low LOD is ideal for distant Avatars or crowds.
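
For example, a distant or crowd Avatar might be requested at low LOD as in the following sketch, where RemoteAvatarComponent and RemoteUserID are placeholders for your own component and user ID:

// Low LOD: lower-resolution meshes and textures for Avatars seen from afar.
RemoteAvatarComponent->RequestAvatar(RemoteUserID, ovrAvatarAssetLevelOfDetail_One, true);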

Expressive Features

Expressive features give Avatars more advanced facial geometry that allows for realistic and nuanced animation of various facial behaviors. Expressive features increase social presence and make interactions seem more natural and dynamic. The following facial behaviors are available for Avatars:

  • Realistic lip-syncing powered by Oculus Lipsync technology.
  • Natural eye behavior, including gaze dynamics and blinking.
  • Ambient facial micro-expressions when an Avatar is not speaking.

For more information, see Expressive Features for Avatars - Unreal.

Testing Your Integration

Once you’ve completed your integration, you can test by retrieving some Avatars in-engine. Use the following user IDs to test:

  • 10150022857785745
  • 10150022857770130
  • 10150022857753417
  • 10150022857731826
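
For a quick smoke test, you can hard-code one of these IDs into your request, mirroring OnLoginComplete above. A sketch:

// Request one of the Oculus-provided test Avatars directly.
const uint64 TestUserID = 10150022857785745;

#if PLATFORM_ANDROID
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Three;
#else
ovrAvatarAssetLevelOfDetail lod = ovrAvatarAssetLevelOfDetail_Five;
#endif

if (AvatarComponent)
{
    AvatarComponent->RequestAvatar(TestUserID, lod, true /* UseCombinedMesh */);
}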