You can replace the default blue avatar with a personalized avatar using the Oculus Platform package. The base Avatar SDK OvrAvatar.cs class is already set up to load the avatar specifications of users, but we need to call Oculus Platform functions to request valid user IDs.
After getting a user ID, we can set the oculusUserID of the avatar accordingly. The timing is important, because this has to happen before the Start() function in OvrAvatar.cs gets called.

The example below shows one way of doing this. It defines a new class, PlatformManager, that extends our existing Getting Started sample. When run, it replaces the default blue avatar with the personalized avatar of the user logged in to Oculus Home.
using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {

    public OvrAvatar myAvatar;

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            myAvatar.oculusUserID = message.Data.ID;
        }
    }
}

Handling Multiple Personalized Avatars
If you have a multi-user scene where each avatar has different personalizations, you probably already have the user IDs of all the users in your scene, because you had to retrieve that data to invite them in the first place. Set the oculusUserID for each user's avatar accordingly.
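As an illustrative sketch of that assignment step, assuming you have already built a mapping from each avatar to its user's ID (the class name and dictionary here are hypothetical, not part of the SDK):

```csharp
using System.Collections.Generic;
using UnityEngine;

public class MultiUserAvatarSetup : MonoBehaviour {

    // Hypothetical mapping from each avatar in the scene to the Oculus
    // user ID retrieved when that user was invited.
    public Dictionary<OvrAvatar, ulong> avatarUserIds;

    void Awake () {
        foreach (KeyValuePair<OvrAvatar, ulong> entry in avatarUserIds) {
            // Must happen before OvrAvatar.Start() runs for the ID to take effect.
            entry.Key.oculusUserID = entry.Value;
        }
    }
}
```

Note that a Dictionary is not serializable in the Unity Inspector; in practice you would populate it from your own invitation or matchmaking data at runtime.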
If your scene contains multiple avatars of the same person, you can iterate through all the avatar objects in the scene to change all their oculusUserID values. For example, the LocalAvatar and RemoteLoopback sample scenes both contain two avatars of the same player.
Here is an example of how to modify the callback of our PlatformManager class to personalize the avatars in the sample scenes:
using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class PlatformManager : MonoBehaviour {

    void Awake () {
        Oculus.Platform.Core.Initialize();
        Oculus.Platform.Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        Oculus.Platform.Request.RunCallbacks();  // avoids race condition with OvrAvatar.cs Start()
    }

    private void GetLoggedInUserCallback(Message<User> message) {
        if (!message.IsError) {
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID;
            }
        }
    }
}

The Avatar Unity package contains two prefabs for Avatars: LocalAvatar and RemoteAvatar. They are located in OvrAvatar > Content > Prefabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements.
The LocalAvatar driver is the OvrAvatarDriver script, which derives avatar movement from the logged-in user's Touch controllers and HMD.
The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its avatar movement from the packet recording and playback system.
There are four sample scenes in the Avatar Unity package:
Controllers
Demonstrates how first-person avatars can be used to enhance the sense of presence for Touch users.
GripPoses
A helper scene for creating custom grip poses. See Custom Touch Grip Poses.
LocalAvatar
Demonstrates the capabilities of both first-person and third-person avatars. Does not yet include microphone voice visualization or loading an Avatar Specification using Oculus Platform.
RemoteLoopback
Demonstrates the avatar packet recording and playback system. See Recording and Playing Back Avatar Pose Updates.
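As a rough sketch of the recording-and-playback wiring that the RemoteLoopback scene demonstrates (the member names below follow the SDK sample, but treat them as assumptions and check them against your SDK version):

```csharp
using UnityEngine;

public class SimpleLoopback : MonoBehaviour {

    public OvrAvatar localAvatar;      // driven by OvrAvatarDriver (Touch + HMD)
    public OvrAvatar loopbackAvatar;   // driven by OvrAvatarRemoteDriver

    private int sequence = 0;

    void Start () {
        // Ask the local avatar to record a pose packet each frame and
        // notify us whenever one is ready.
        localAvatar.RecordPackets = true;
        localAvatar.PacketRecorded += OnPacketRecorded;
    }

    void OnPacketRecorded (object sender, OvrAvatar.PacketEventArgs args) {
        // Feed each recorded packet straight into the remote driver,
        // which plays it back on the loopback avatar.
        var driver = loopbackAvatar.GetComponent<OvrAvatarRemoteDriver>();
        driver.QueuePacket(sequence++, args.Packet);
    }
}
```

In a real networked app you would serialize the packet and send it to remote peers instead of queuing it locally.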
Each avatar in your scene requires 11 draw calls per eye per frame (22 total). The Combine Meshes option reduces this to 3 draw calls per eye (6 total) by combining all the mesh parts into a single mesh. This is an important performance gain for Gear VR as most apps typically need to stay within a draw call budget of 50 to 100 draw calls per frame. Without this option, just having 4 avatars in your scene would use most or all of that budget.
You should almost always select this option when using avatars. The only drawback to using this option is that you are no longer able to access mesh parts individually, but that is a rare use case.
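Combine Meshes is normally enabled on the prefab in the Inspector, but if you want to enforce it from script, a minimal sketch (assuming the OvrAvatar component exposes a public CombineMeshes flag, as in the SDK source) is:

```csharp
using UnityEngine;

public class CombineAvatarMeshes : MonoBehaviour {

    void Awake () {
        // Enable mesh combining on every avatar before it initializes,
        // cutting draw calls from 11 per eye to 3 per eye per avatar.
        foreach (OvrAvatar avatar in FindObjectsOfType<OvrAvatar>()) {
            avatar.CombineMeshes = true;
        }
    }
}
```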
The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time.
In this example, we will pose the left hand to make a scissors (or bunny ears) gesture.
Creating the left hand pose:
In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.

Locate all the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints progress toward the fingertip. To adjust the pinky finger joints, for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.
In the Hierarchy window, select the joint you want to rotate.

In the Scene window, click a rotation orbit and drag the joint to the desired angle.

Saving the left hand pose:
Using the left hand pose:
Click Play again. You will see that the left hand is now frozen in our custom bunny grip pose.
To make Rift avatars appear in stand-alone executable builds, we need to change two settings:
To allow avatars to interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.
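As a hedged sketch of the grab setup: OVRGrabber components sit on the avatar's hand anchors, while any object to be picked up needs a Collider, a Rigidbody, and the OVRGrabbable component. Something like the following could prepare an object at runtime:

```csharp
using UnityEngine;

public class MakeGrabbable : MonoBehaviour {

    void Awake () {
        // A grabbable object needs a collider and physics body so the
        // OVRGrabber on the hand can detect and hold it.
        gameObject.AddComponent<SphereCollider>();
        gameObject.AddComponent<Rigidbody>();
        gameObject.AddComponent<OVRGrabbable>();
    }
}
```

In practice you would usually add these components in the editor, as the AvatarWithGrab sample scene does, rather than at runtime.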