Unity Developer Guide - Rift

The Oculus Integration for Unity package contains several Oculus Avatar prefabs you can drop into your existing Unity projects. This guide shows you how to start using them.

Configure Your Unity Project

  1. Create a new project in Unity.
  2. There are two ways to import the Oculus APIs into the Unity Editor. You can either:
    • Navigate to the Oculus Integration page and select Import.
    • In the Unity Editor, select the Asset Store tab, search for “Oculus Integration”, and select Import.
  3. From Edit > Project Settings > Player, select the Virtual Reality Supported check box.
  4. From the Hierarchy, delete Main Camera from your scene.
  5. From the Project window, drag OVRCameraRig from Assets > Oculus > VR > Prefabs.
  6. Reset the transform on OVRCameraRig.
  7. From Oculus > Avatars > Edit Settings, add a valid Oculus Rift App Id.

Note: A valid App ID is not required during development, but using one is highly recommended. You may ignore any No Oculus Rift App ID warnings that appear during development. While an App ID is required to retrieve Oculus Avatars for specific users, you can prototype and test experiences that use Touch and Avatars with just the default blue Avatar.

Adding an Avatar to the Scene

The LocalAvatar prefab renders the user’s Avatar and hands. You can choose which parts of the Avatar you want to render: body, hands, and Touch controllers.

Note: A valid App ID is required to retrieve Oculus Avatars for specific users. To add an App ID, go to Oculus > Avatars > Edit Settings.

To render Avatar hands with Touch controllers:

  1. Drag Assets > Oculus > Avatar > Content > Prefabs > LocalAvatar to the Unity Hierarchy window.
  2. In the Unity Inspector window, select the Start With Controllers check box.

Click Play to test. Try out the built-in hand poses and animations by playing with the Touch controllers.

To render Avatar hands without controllers:

  1. In the Hierarchy window, select LocalAvatar.
  2. In the Inspector window, clear the Start With Controllers check box.

Click Play to test. Squeeze and release the grips and triggers on the Touch controllers and observe how the finger joints transform to change hand poses.
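The pose changes you see are driven by the analog grip and trigger values on the Touch controllers. If you want to inspect those values yourself, a minimal sketch using the OVRInput API might look like this (TouchInputLogger is an illustrative name, not part of the SDK):

using UnityEngine;

// Illustrative helper: logs the analog grip and index-trigger values
// that drive the Avatar's hand poses.
public class TouchInputLogger : MonoBehaviour
{
    void Update()
    {
        float grip = OVRInput.Get(OVRInput.Axis1D.PrimaryHandTrigger, OVRInput.Controller.LTouch);
        float index = OVRInput.Get(OVRInput.Axis1D.PrimaryIndexTrigger, OVRInput.Controller.LTouch);
        Debug.Log(string.Format("left grip={0:F2} index={1:F2}", grip, index));
    }
}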

You can see what the Avatar looks like from a third-person perspective by doing the following:

  1. In the Hierarchy window, select LocalAvatar.
  2. In the Inspector window, select the Show Third Person check box.
  3. Change Transform > Position to X:0 Y:0 Z:1.5.
  4. Change Transform > Rotation to X:0 Y:180 Z:0.

Click Play to test.
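If you prefer to set up the third-person view from code, the same steps can be sketched in a small script. This assumes the ShowThirdPerson field on OvrAvatar corresponds to the Inspector check box of the same name; ThirdPersonToggle is an illustrative name:

using UnityEngine;

// Illustrative sketch: mirror the third-person Inspector steps from a script.
public class ThirdPersonToggle : MonoBehaviour
{
    public OvrAvatar avatar; // drag LocalAvatar here in the Inspector

    void Start()
    {
        avatar.ShowThirdPerson = true;                              // Show Third Person check box
        avatar.transform.position = new Vector3(0f, 0f, 1.5f);      // 1.5 m in front of the rig
        avatar.transform.rotation = Quaternion.Euler(0f, 180f, 0f); // face back toward the user
    }
}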

Recording and Playing Back Avatar Pose Updates

The Avatar packet recording system saves Avatar movement data as packets you can send across a network to play back on a remote system. Let's take a quick tour of the RemoteLoopbackManager script.

Open the RemoteLoopback scene in Assets > Oculus > Avatar > Samples > RemoteLoopback.

Set RecordPackets to true to start the Avatar packet recording system. Also, subscribe to the event handler PacketRecorded so that you can trigger other actions each time a packet is recorded.

void Start () {
    LocalAvatar.RecordPackets = true;
    LocalAvatar.PacketRecorded += OnLocalAvatarPacketRecorded;
}

Each time a packet is recorded, the code places the packet into a memory stream being used as a stand-in for a real network layer.

void OnLocalAvatarPacketRecorded(object sender, OvrAvatar.PacketEventArgs args)
{
    using (MemoryStream outputStream = new MemoryStream())
    {
        BinaryWriter writer = new BinaryWriter(outputStream);
        writer.Write(packetSequence++);         // packet sequence number
        args.Packet.Write(outputStream);        // serialize the recorded packet
        SendPacketData(outputStream.ToArray());
    }
}

The remainder of the code receives the packet from the memory stream for playback on the loopback avatar object.

void SendPacketData(byte[] data)
{
    // Loopback: hand the bytes straight to the receiver instead of a real network.
    ReceivePacketData(data);
}

void ReceivePacketData(byte[] data)
{
    using (MemoryStream inputStream = new MemoryStream(data))
    {
        BinaryReader reader = new BinaryReader(inputStream);
        int sequence = reader.ReadInt32();
        OvrAvatarPacket packet = OvrAvatarPacket.Read(inputStream);
        LoopbackAvatar.GetComponent<OvrAvatarRemoteDriver>().QueuePacket(sequence, packet);
    }
}

Loading Personalized Avatars

You can replace the default Avatar with a personalized Avatar. The base OvrAvatar.cs class is already set up to load the Avatar specifications of users, but we need to call Oculus Platform functions to get valid user IDs.

After getting a user ID, we then can set the oculusUserID of the Avatar accordingly. The timing is important, because we have to set the user ID before the Start() function in OvrAvatar.cs gets called.

Note: For security reasons, Oculus Avatars and Oculus Platform must be initialized with a valid App ID before accessing user ID information. You can create a new application and obtain an App ID from the developer dashboard. For more information, see Oculus Platform Setup.

The example below shows one way of doing this. It defines a new class that controls the platform. After you modify the sample with this new class, you see the personalized Avatar of the current Oculus user instead of the default Avatar.

  1. Import the Oculus Integration into your Unity project.
  2. Specify valid App IDs for both the Oculus Avatars and Oculus Platform plugins:
    • From the menu bar, click Oculus > Avatars > Edit Settings and paste your App ID into the field.
    • From the menu bar, click Oculus > Platform > Edit Settings and paste your App ID into the field.
  3. Add an OVRCameraRig and LocalAvatar to the scene as described earlier in this topic.
  4. Create an empty game object called <objectname>:
    • Click GameObject > Create Empty.
    • Rename the game object <objectname>.
  5. Click Add Component, enter New Script in the search field, and then select New Script.
  6. Name the script <filename> and set Language to C Sharp.
  7. Save the text below as Assets\<filename>.cs.

    using UnityEngine;
    using Oculus.Avatar;
    using Oculus.Platform;
    using Oculus.Platform.Models;
    using System.Collections;

    public class <classname> : MonoBehaviour
    {
        public OvrAvatar myAvatar;

        void Awake () {
            Oculus.Platform.Core.AsyncInitialize().OnComplete(OnInitComplete);
        }

        private void OnInitComplete(Message<PlatformInitialize> message)
        {
            Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
        }

        private void GetLoggedInUserCallback(Message<User> message)
        {
            if (!message.IsError) {
                myAvatar.oculusUserID = message.Data.ID.ToString();
            }
        }
    }
  8. In the Unity Editor, select the game object you created from the Hierarchy. The My Avatar field appears in the Inspector.
  9. Drag LocalAvatar from the Hierarchy to the My Avatar field.

Press Play and the personalized Avatar should load. Depending on how your scene is set up, you may want to try viewing the Avatar from a third-person perspective as described in the “Adding an Avatar to the Scene” section to get a better look.

Handling Multiple Personalized Avatars

In a multi-user scene where each avatar has different personalizations, you already have the user IDs of all the users in your scene because you had to retrieve that data to invite them in the first place. Set the oculusUserID for each user’s Avatar accordingly.
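As a sketch, spawning a remote user's Avatar with an ID you already have from your invite or matchmaking flow could look like this (RemoteAvatarSpawner and SpawnFor are illustrative names, not part of the SDK):

using UnityEngine;

// Illustrative sketch: personalize a RemoteAvatar with a known user ID.
public class RemoteAvatarSpawner : MonoBehaviour
{
    public OvrAvatar remoteAvatarPrefab; // the RemoteAvatar prefab

    public OvrAvatar SpawnFor(ulong userId, Vector3 position)
    {
        OvrAvatar avatar = Instantiate(remoteAvatarPrefab, position, Quaternion.identity);
        // Must be set before the avatar's Start() runs, so do it right after Instantiate.
        avatar.oculusUserID = userId.ToString();
        return avatar;
    }
}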

If your scene contains multiple Avatars of the same person, such as in our LocalAvatar and RemoteLoopback sample scenes, you can iterate through all the Avatar objects in the scene to change all their oculusUserID values. Here is an example of how to modify the callback of our new class to personalize the Avatars in those two sample scenes:

using UnityEngine;
using Oculus.Avatar;
using Oculus.Platform;
using Oculus.Platform.Models;
using System.Collections;

public class <classname> : MonoBehaviour
{
    void Awake ()
    {
        Oculus.Platform.Core.AsyncInitialize().OnComplete(OnInitComplete);
    }

    private void OnInitComplete(Message<PlatformInitialize> message)
    {
        Users.GetLoggedInUser().OnComplete(GetLoggedInUserCallback);
    }

    private void GetLoggedInUserCallback(Message<User> message)
    {
        if (!message.IsError)
        {
            // Personalize every Avatar in the scene, including the loopback copy.
            OvrAvatar[] avatars = FindObjectsOfType(typeof(OvrAvatar)) as OvrAvatar[];
            foreach (OvrAvatar avatar in avatars) {
                avatar.oculusUserID = message.Data.ID.ToString();
            }
        }
    }
}

Cross-Platform Avatar Support

The removal of dependencies on the Oculus runtime enables developers making multi-platform apps to use Oculus Avatars on any PC platform that can use the Avatar SDK. To see a demo of this functionality, see the Unity CrossPlatform sample included with the SDK.

For more information, see the Unity CrossPlatform Sample Scene topic.

Retrieve an Avatar’s Preview Image

You can retrieve an Avatar’s preview image for use in your app by making a server-to-server (S2S) API request. First, request a user access token through Users.GetAccessToken(), and then make an S2S call to https://graph.oculus.com/[USER_ID]?access_token=[USER_TOKEN]&fields=avatar_v2{avatar_image{uri}}, where [USER_ID] is the user’s ID and [USER_TOKEN] is the user access token. For more detailed information on S2S API calls, see Server-to-Server API Basics.
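For example, a server-side sketch of that request (outside Unity) using .NET's HttpClient might look like the following; the response JSON contains the image URI under avatar_v2.avatar_image.uri:

using System.Net.Http;
using System.Threading.Tasks;

// Server-side sketch (not Unity code): fetch the Avatar preview image metadata.
class AvatarPreviewFetcher
{
    static async Task<string> GetPreviewJson(string userId, string userToken)
    {
        string url = "https://graph.oculus.com/" + userId +
                     "?access_token=" + userToken +
                     "&fields=avatar_v2{avatar_image{uri}}";
        using (HttpClient http = new HttpClient())
        {
            return await http.GetStringAsync(url); // JSON with avatar_v2.avatar_image.uri
        }
    }
}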

Avatar Prefabs

The Avatar Unity package contains two prefabs for Avatars: LocalAvatar and RemoteAvatar.

They are located in Assets > Oculus > Avatar > Content > Prefabs. The difference between LocalAvatar and RemoteAvatar is in the driver, the control mechanism behind avatar movements.

The LocalAvatar driver is the OvrAvatarDriver script, which derives Avatar movement from the logged-in user’s controllers and HMD.

The RemoteAvatar driver is the OvrAvatarRemoteDriver script, which gets its Avatar movement from the packet recording and playback system.

Ensuring Proper Lighting

Dynamic lighting of your Avatar ensures that your user’s Avatar looks and feels at home in your scene. The primary light in your scene is used to calculate lighting.

If you must use multiple real-time light sources, which is highly discouraged, you can set the primary light source in Unity's lighting settings.

The _Cubemap texture is designed to work with reflection probes and applies the reflection according to the alpha channel of the roughness map.

Custom Touch Grip Poses

The GripPoses sample lets you change the hand poses by rotating the finger joints until you get the pose you want. You can then save these finger joint positions as a Unity prefab that you can load at a later time.

In this example, we pose the left hand to make it look like a scissors or bunny rabbit gesture.

Creating the left hand pose:

  1. Open the Samples > GripPoses > GripPoses scene.
  2. Click Play.
  3. Press E to select the Rotate transform tool.
  4. In the Hierarchy window, expand LocalAvatar > hand_left > LeftHandPoseEditHelp > hands_l_hand_world > hands:b_l_hand.

  5. Locate the joints of the fingers you want to adjust. Joint 0 is closest to the palm; subsequent joints progress toward the fingertip. To adjust the pinky finger joints, for example, expand hands:b_l_pinky0 > hands:b_l_pinky1 > hands:b_l_pinky2 > hands:b_l_pinky3.

  6. In the Hierarchy window, select the joint you want to rotate.

  7. In the Scene window, click a rotation orbit and drag the joint to the desired angle.

  8. Repeat these two steps until you achieve the desired pose.

Saving the left hand pose:

  1. In the Hierarchy window, drag hands_l_hand_world to the Project window.
  2. In the Project window, rename this transform to something descriptive, for example: poseBunnyRabbitLeft.

Using the left hand pose:

  1. In the Hierarchy window, select LocalAvatar.
  2. Drag poseBunnyRabbitLeft from the Project window to the Left Hand Custom Pose field in the Inspector window.

Click Play again. You see that the left hand is now frozen in our custom bunny grip pose.
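You can also assign the saved pose from a script rather than the Inspector. This sketch assumes the LeftHandCustomPose field on OvrAvatar corresponds to the Left Hand Custom Pose Inspector field; GripPoseSwitcher is an illustrative name, and depending on your SDK version the pose may only be read at startup:

using UnityEngine;

// Illustrative sketch: assign the saved bunny pose from code.
public class GripPoseSwitcher : MonoBehaviour
{
    public OvrAvatar avatar;              // drag LocalAvatar here
    public Transform poseBunnyRabbitLeft; // the pose prefab saved above

    void Start()
    {
        avatar.LeftHandCustomPose = poseBunnyRabbitLeft; // null reverts to normal poses
    }
}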

Grabbing Objects with Rift Hands

To let avatars interact with objects in their environment, use the OVRGrabber and OVRGrabbable components. For a working example, see the AvatarWithGrab sample scene included in the Oculus Unity Sample Framework.
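As a minimal sketch of the grabbable side, any object with a Rigidbody and an OVRGrabbable component can be picked up by hands that carry OVRGrabber (GrabbableCubeSetup is an illustrative name):

using UnityEngine;

// Illustrative sketch: create a small cube the avatar's hands can grab.
public class GrabbableCubeSetup : MonoBehaviour
{
    void Start()
    {
        GameObject cube = GameObject.CreatePrimitive(PrimitiveType.Cube);
        cube.transform.position = new Vector3(0f, 1f, 0.5f);
        cube.transform.localScale = Vector3.one * 0.1f;
        cube.AddComponent<Rigidbody>();    // OVRGrabbable requires a Rigidbody
        cube.AddComponent<OVRGrabbable>(); // lets OVRGrabber hands pick it up
    }
}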

Cubemap Reflections on Avatars

Certain elements of Avatars, such as sunglasses, reflect their environment using cubemaps as a source. The basic effect is automatic for supported Avatar elements, with the shader sampling the default skybox cubemap as a source for reflections.

For more control over reflections, you can add a Reflection Probe component to an Avatar to supply a different cubemap to the shader. All three Reflection Probe types (Baked, Custom, and Realtime) are supported, but there is an increased performance hit when using Realtime reflections.
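For instance, adding a probe from code might be sketched as follows (AvatarReflectionSetup is an illustrative name; which cubemap asset you assign is up to you):

using UnityEngine;
using UnityEngine.Rendering;

// Illustrative sketch: give the avatar its own reflection source.
public class AvatarReflectionSetup : MonoBehaviour
{
    public Cubemap customCubemap; // assign a cubemap asset in the Inspector

    void Start()
    {
        ReflectionProbe probe = gameObject.AddComponent<ReflectionProbe>();
        probe.mode = ReflectionProbeMode.Custom;  // Baked and Realtime also work
        probe.customBakedTexture = customCubemap; // cubemap the shader will sample
        probe.size = new Vector3(2f, 2f, 2f);     // volume surrounding the avatar
    }
}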

Expressive Features

Expressive features give Avatars more advanced facial geometry that allows for realistic and nuanced animation of various facial behaviors. Expressive features increase social presence and make interactions seem more natural and dynamic. The following facial behaviors are available for Avatars:

  • Realistic lip-syncing powered by Oculus Lipsync technology.
  • Natural eye behavior, including gaze dynamics and blinking.
  • Ambient facial micro-expressions when an Avatar is not speaking.

For more information, see Expressive Features for Avatars - Unity.

Switching Between Transparent and Opaque Render Queues

The Use Transparent Render Queue option allows you to switch between the transparent and opaque render queues. By default, this option is enabled and the transparent queue is used. This must be set before runtime, and you must restart for changes to take effect. Using the opaque render queue eliminates the need for blending, and thus reduces overdraw.

Making Rift Stand-Alone Builds

To make Rift Avatars appear in stand-alone executable builds, you need to change two settings.

Add the Avatar shaders to the Always Included Shaders list in your project settings:

  1. Click Edit > Project Settings > Graphics.
  2. Under Always Included Shaders, increase Size by 3, press Enter, and then assign the Avatar shaders to the three new elements.

Build as a 64-bit application:

  1. Click File > Build Settings.
  2. Set Architecture to x86_64.

Testing Your Integration

Once you’ve completed your integration, you can test by retrieving some Avatars in-engine. Use the following user IDs to test:

  • 10150022857785745
  • 10150022857770130
  • 10150022857753417
  • 10150022857731826
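To try one of these IDs, you can hard-code it on the LocalAvatar during development (AvatarTestLoader is an illustrative name; remove the hard-coded ID before shipping):

using UnityEngine;

// Illustrative development-only helper: load one of the test Avatars above.
public class AvatarTestLoader : MonoBehaviour
{
    public OvrAvatar myAvatar; // drag LocalAvatar here

    void Awake()
    {
        // Must run before OvrAvatar.Start(); uses one of the test user IDs listed above.
        myAvatar.oculusUserID = "10150022857785745";
    }
}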