The Avatar Unity package contains several prefabs you can drop into your existing Unity projects. This tutorial shows you how to start using them.
The LocalAvatar prefab renders the user's Avatar and hands. You can choose which parts of the Avatar you want to render: body, hands, and Touch controllers.
To render Avatar hands with Touch controllers:
Click Play to test. Try out the built-in hand poses and animations by playing with the Touch controllers.
Click Play to test. Squeeze and release the grips and triggers on the Touch controllers and observe how the finger joints transform to change hand poses.
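If you prefer to set this up from code rather than the Inspector, something like the following could work. This is a minimal sketch: the ControllerToggle class name is hypothetical, and it assumes the OvrAvatar component exposes a public StartWithControllers field, as it does in the SDK sample scenes.

```csharp
using UnityEngine;

public class ControllerToggle : MonoBehaviour {
    public OvrAvatar localAvatar; // drag your LocalAvatar prefab instance here

    void Awake() {
        // Setting StartWithControllers before the avatar initializes
        // renders the Touch controllers in the avatar's hands.
        localAvatar.StartWithControllers = true;
    }
}
```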
The Avatar packet recording system saves Avatar movement data as packets that you can send across a network and play back on a remote system. Let's take a quick tour of the RemoteLoopbackManager script.
Open the RemoteLoopback scene in OvrAvatar > Samples > RemoteLoopback.
Set RecordPackets to true to start the Avatar packet recording system. Also, subscribe to the event handler PacketRecorded so that you can trigger other actions each time a packet is recorded.
void Start () {
    LocalAvatar.RecordPackets = true;
    LocalAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
}

Each time a packet is recorded, the code places the packet into a memory stream being used as a stand-in for a real network layer.
void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
{
    using (MemoryStream outputStream = new MemoryStream())
    {
        BinaryWriter writer = new BinaryWriter(outputStream);
        writer.Write(packetSequence++);
        args.Packet.Write(outputStream);
        SendPacketData(outputStream.ToArray());
    }
}

The remainder of the code receives the packet from the memory stream for playback on the loopback avatar object.
void SendPacketData(byte[] data)
{
    ReceivePacketData(data);
}

void ReceivePacketData(byte[] data)
{
    using (MemoryStream inputStream = new MemoryStream(data))
    {
        BinaryReader reader = new BinaryReader(inputStream);
        int sequence = reader.ReadInt32();
        OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);
        LoopbackAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, packet);
    }
}

You can replace the default blue Avatar with a personalized Avatar using the Oculus Platform package. The base Avatar SDK OvrAvatar.cs class is already set up to load the avatar specifications of users, but we need to call Oculus Platform functions to get valid user IDs.
After getting a user ID, we then can set the oculusUserID of the Avatar accordingly. The timing is important, because we have to set the user ID before the Start() function in OvrAvatar.cs gets called.
The example below shows one way of doing this. It defines a new class that controls the platform. After modifying the sample with our new class, the Avatar SDK shows you the personalized Avatar of the current Oculus Home user instead of the default blue Avatar.
using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class <classname> : MonoBehaviour {

    public OvrAvatar myAvatar;

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            myAvatar.oculusUserID = message.Data.ID;
        }
    }
}

Handling Multiple Personalized Avatars
In a multi-user scene where each avatar has different personalizations, you already have the user IDs of all the users in your scene, because you had to retrieve that data to invite them in the first place. Set the oculusUserID of each user's Avatar accordingly.
If your scene contains multiple Avatars of the same person, such as in our LocalAvatar and RemoteLoopback sample scenes, you can iterate through all the Avatar objects in the scene to change all their oculusUserID values. Here is an example of how to modify the callback of our new class to personalize the Avatars in those two sample scenes:
using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class <classname> : MonoBehaviour {

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID;
            }
        }
    }
}

The Avatar Unity package contains two prefabs for Avatars: LocalAvatar and RemoteAvatar.
They are located in OvrAvatar > Content > PreFabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements.
The LocalAvatar driver is the OvrAvatarDriver script, which derives Avatar movement from the logged-in user's controllers and HMD.
The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its Avatar movement from the packet recording and playback system.
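As a rough sketch of how the RemoteAvatar prefab and its driver fit together at runtime, something like the following could work. The RemoteAvatarSpawner class name and Spawn helper are hypothetical; the prefab, OvrAvatar, and OvrAvatarRemoteDriver names come from the SDK as described above.

```csharp
using UnityEngine;

public class RemoteAvatarSpawner : MonoBehaviour {
    public GameObject remoteAvatarPrefab; // OvrAvatar > Content > PreFabs > RemoteAvatar

    // Hypothetical helper: spawn a remote avatar for a given user and return
    // the driver that your networking layer should feed packets into.
    public OvrAvatarRemoteDriver Spawn(ulong userId) {
        GameObject avatarGO = Instantiate(remoteAvatarPrefab);

        // Personalize before the avatar's Start() runs.
        avatarGO.GetComponent<OvrAvatar>().oculusUserID = userId;

        // Received packets go to QueuePacket() on this driver,
        // as in the RemoteLoopback sample above.
        return avatarGO.GetComponent<OvrAvatarRemoteDriver>();
    }
}
```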
Dynamic lighting of your Avatar ensures that your user’s Avatar looks and feels at home in your scene. The primary light in your scene is used to calculate lighting.
If you must use multiple real-time light sources, which is highly discouraged, you can set the primary light source in Unity's lighting settings.
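The primary light can also be designated from a script via Unity's RenderSettings.sun property, which corresponds to the Sun Source field in the Lighting window. A minimal sketch (the PrimaryLightSetter class name is hypothetical):

```csharp
using UnityEngine;

public class PrimaryLightSetter : MonoBehaviour {
    public Light primaryLight; // the directional light avatars should be lit by

    void Awake() {
        // RenderSettings.sun marks this light as the scene's primary (sun)
        // light, the same setting exposed in Unity's Lighting window.
        RenderSettings.sun = primaryLight;
    }
}
```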
The _Cubemap texture is designed to work with reflection probes and applies the reflection according to the alpha channel of the roughness map.
The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time.
In this example, we pose the left hand to make it look like a scissors or bunny rabbit gesture.
Creating the left hand pose:
In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.

Locate all the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints progress toward the fingertip. To adjust the pinky finger joints, for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.
In the Hierarchy window, select the joint you want to rotate.

In the Scene window, click a rotation orbit and drag the joint to the desired angle.

Saving the left hand pose:
Using the left hand pose:
Click Play again. You see that the left hand is now frozen in our custom bunny grip pose.
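The saved pose prefab can also be assigned from a script instead of the Inspector. This is a sketch under the assumption that your version of OvrAvatar.cs exposes the Left Hand Custom Pose field as a public LeftHandCustomPose transform; the GripPoseLoader class name is hypothetical.

```csharp
using UnityEngine;

public class GripPoseLoader : MonoBehaviour {
    public OvrAvatar localAvatar;
    public Transform leftHandPosePrefab; // the pose prefab you saved above

    void Awake() {
        // Assigning a custom pose freezes the left hand in that pose,
        // overriding the built-in controller-driven hand animations.
        localAvatar.LeftHandCustomPose = leftHandPosePrefab;
    }
}
```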
To let avatars interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.
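To make an object grabbable, it needs a collider and a rigidbody alongside the OVRGrabbable component. The following sketch wires those up at runtime (the MakeGrabbable class name is hypothetical; in practice you would usually add these components in the Editor, as the AvatarWithGrab sample does):

```csharp
using UnityEngine;

public class MakeGrabbable : MonoBehaviour {
    void Awake() {
        // OVRGrabbable requires a collider to detect grabs and a
        // rigidbody so the object responds to being held and released.
        if (GetComponent<Collider>() == null) {
            gameObject.AddComponent<BoxCollider>();
        }
        if (GetComponent<Rigidbody>() == null) {
            gameObject.AddComponent<Rigidbody>();
        }
        gameObject.AddComponent<OVRGrabbable>();
    }
}
```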
To make Rift Avatars appear in stand-alone executable builds, you need to change two settings.
Once you’ve completed your integration, you can test by retrieving some Avatars in-engine. Use the following user IDs to test:
10150022857785745
10150022857770130
10150022857753417
10150022857731826
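To load one of these test avatars, you can bypass the Platform user lookup and assign the ID directly, as in this sketch (the TestAvatarLoader class name is hypothetical; note that some Avatar SDK versions declare oculusUserID as a string rather than a ulong, in which case the ID needs quotes):

```csharp
using UnityEngine;

public class TestAvatarLoader : MonoBehaviour {
    public OvrAvatar myAvatar;

    void Awake() {
        // Load one of the test avatar specifications directly,
        // skipping Oculus.Platform.Users.GetLoggedInUser().
        myAvatar.oculusUserID = 10150022857785745;
    }
}
```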