Eye Tracking for Movement SDK for Unity
Updated: Dec 20, 2024
Eye tracking technology detects eye movements to control an avatar's eye transformations as the user looks around. The Meta Quest Pro headset is the only device that supports this feature, which is driven by the OVREyeGaze script.
Note: If you are just getting started with this Meta XR feature, we recommend that you use Building Blocks, a Unity extension for Meta XR SDKs, to quickly add features to your project.
The OVREyeGaze MonoBehaviour component provides eye tracking or gaze information. It retrieves eye pose data from the OVRPlugin in tracking space. When you add the OVREyeGaze component to a GameObject, it can simulate an eye, updating its position and orientation based on actual human eye movements. This component enhances the expressiveness of both realistic and stylized characters and can select objects in a scene using raycasts. If you do not set up the necessary eye tracking permissions, tracking will not occur.
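Once OVREyeGaze is driving an eye GameObject (and the eye tracking permission has been granted), the tracked gaze is available through that GameObject's transform. The following is a minimal sketch; the GazeReader class name is illustrative, and it assumes an OVREyeGaze component with Apply Rotation enabled sits on the same GameObject:

```csharp
using UnityEngine;

// Minimal sketch: reads the gaze direction that OVREyeGaze writes into the
// eye GameObject's transform each frame.
public class GazeReader : MonoBehaviour
{
    void Update()
    {
        // With Apply Rotation enabled, OVREyeGaze rotates this transform so
        // that its forward vector points along the user's gaze.
        Vector3 gazeDirection = transform.forward;

        // Visualize the gaze in the Scene view for debugging.
        Debug.DrawRay(transform.position, gazeDirection * 2f, Color.green);
    }
}
```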
Your use of the Eye Tracking API must at all times be consistent with the Oculus SDK License Agreement, the Developer Data Use Policy, and all other applicable Oculus and Meta policies, terms, and conditions. Your use of Movement may also be covered by applicable privacy and data protection laws.
In particular, you must post and abide by a publicly available and easily accessible privacy policy that clearly explains your collection, use, retention, and processing of data through the Eye Tracking API. You must ensure that a user is provided with clear and comprehensive information about, and consents to, your access to and use of abstracted gaze data prior to collection, including as required by applicable privacy and data protection laws.
Please note that we reserve the right to monitor your use of the Eye Tracking API to enforce compliance with our policies.
When a user enables eye tracking for your app, your app is granted access to real-time abstracted gaze data, which is user data under the Developer Data Use Policy. You are expressly forbidden from using this data for any of the Data Use Prohibited Practices described in the Developer Data Use Policy. The eye tracking feature is powered by our Eye Tracking API technology.
Learning Objective
After completing this section, the developer should be able to:
1. Set up a new project for eye tracking.
2. Enable a character to support eye tracking.
Set up a project that supports eye tracking
After you have configured your project for VR, follow these steps.
- Make sure you have an OVRCameraRig prefab in your scene. The prefab is located at Packages/com.meta.xr.sdk.core/Prefabs/OVRCameraRig.prefab.
- From the OVRCameraRig object, navigate to the OVRManager component.
- Select Target Devices.
- Scroll down to Quest Features > General.
- If you want hand tracking, select Controllers and Hands for Hand Tracking Support.
- Under General, make sure Eye Tracking Support is selected. Click General if that view isn’t showing.
- Under OVRManager, select Eye Tracking under Permissions Request On Startup (a runtime permission sketch follows these steps).
- If your project depends on face tracking, eye tracking, or hand tracking, ensure that these are enabled on your HMD. This is typically part of the device setup, but you can verify or change the settings by clicking Settings > Movement Tracking.
- Fix any issues diagnosed by the Project Setup Tool. On the menu in Unity, go to Edit > Project Settings > Meta XR to access the Project Setup Tool.
- Select your platform.
- Select Fix All if there are any issues. For details, see Use Project Setup Tool.
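The Permissions Request On Startup checkbox asks OVRManager to request the permission for you. If you prefer to check or request it yourself, a minimal sketch follows; the EyeTrackingPermissionCheck class name is illustrative, and com.oculus.permission.EYE_TRACKING is the Android permission string that Quest devices use for eye tracking:

```csharp
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

// Illustrative sketch: checks the eye tracking permission at runtime and
// requests it from the user if it has not been granted yet.
public class EyeTrackingPermissionCheck : MonoBehaviour
{
    const string EyeTrackingPermission = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        if (!Permission.HasUserAuthorizedPermission(EyeTrackingPermission))
        {
            // Shows the system permission dialog; eye tracking stays inactive
            // until the user grants access.
            Permission.RequestUserPermission(EyeTrackingPermission);
        }
#endif
    }
}
```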
Set up a character for eye tracking
- Choose the GameObject that will represent your character's eyeball.
- Attach the OVREyeGaze component to it.
- Set the component's reference frame, specify the eye (left or right), and set the confidence threshold.
- Enable Apply Rotation to allow real-time rotation of the eye GameObject as it tracks the corresponding eye. Optionally, enable Apply Position to allow positional adjustments.
To function correctly, this component needs a reference frame with a world-space orientation, typically aligned with the eye's forward direction; it uses this frame to calculate the eye GameObject's initial offset.
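The same configuration can be done from a script. Below is a sketch that assumes the field and enum names of the OVREyeGaze component shipped with the Meta XR Core SDK (verify them against your SDK version); the EyeSetup class name and its serialized fields are illustrative:

```csharp
using UnityEngine;

// Illustrative sketch: attaches and configures an OVREyeGaze component on the
// eyeball GameObject at runtime.
public class EyeSetup : MonoBehaviour
{
    [SerializeField] GameObject leftEye;       // the eyeball GameObject
    [SerializeField] Transform referenceFrame; // world-space frame aligned with the eye's forward direction

    void Awake()
    {
        var gaze = leftEye.AddComponent<OVREyeGaze>();
        gaze.Eye = OVREyeGaze.EyeId.Left;                          // which eye this GameObject represents
        gaze.TrackingMode = OVREyeGaze.EyeTrackingMode.WorldSpace; // see tracking modes below
        gaze.ConfidenceThreshold = 0.5f;                           // ignore samples below this confidence
        gaze.ApplyRotation = true;                                 // rotate the eye in real time
        gaze.ApplyPosition = false;                                // skip positional adjustments
        gaze.ReferenceFrame = referenceFrame;                      // used to compute the initial offset
    }
}
```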
The key attributes include Confidence Threshold and Tracking Mode. If the eye tracking data falls below the set confidence threshold, the OVREyeGaze component will not apply the data to the GameObject.
Tracking modes include:
- World Space: Converts eye pose from tracking to world space.
- Head Space: Converts eye pose from tracking to local space relative to the VR camera rig.
- Tracking Space: Uses raw pose data from VR tracking space.
Frequently asked questions
Do I need to apply correctives for eye tracking?
No, correctives are generally not necessary, although calibration through the Meta Quest Pro OS might be required for enhanced accuracy.
How does this work with realistic or stylized characters?
The OVREyeGaze component can animate any character's eye. Stylized characters with larger eyes may show more pronounced movements.
Is Eye Tracking available on both Meta Quest Pro and Meta Quest?
Currently, only the Quest Pro supports Eye Tracking.
Can I use eye tracking to emphasize specific scene areas?
Yes, using raycasting driven by the eye transform’s forward direction can highlight areas of interest.
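As a sketch of that idea (the GazeHighlighter class name is illustrative), the script below can be attached to the eye GameObject whose rotation OVREyeGaze drives; it tints the Renderer of whatever the gaze ray hits:

```csharp
using UnityEngine;

// Illustrative sketch: casts a ray along the tracked gaze direction each frame
// and tints whatever Renderer it hits, restoring the previous color when the
// gaze moves away. Assumes materials with a standard color property.
public class GazeHighlighter : MonoBehaviour
{
    [SerializeField] float maxDistance = 10f;

    Renderer lastHit;
    Color originalColor;

    void Update()
    {
        // OVREyeGaze keeps this transform's forward vector aligned with the gaze.
        if (Physics.Raycast(transform.position, transform.forward, out RaycastHit hit, maxDistance)
            && hit.collider.TryGetComponent(out Renderer hitRenderer))
        {
            if (hitRenderer != lastHit)
            {
                ClearHighlight();
                lastHit = hitRenderer;
                originalColor = hitRenderer.material.color;
                hitRenderer.material.color = Color.yellow;
            }
        }
        else
        {
            ClearHighlight();
        }
    }

    void ClearHighlight()
    {
        if (lastHit != null)
        {
            lastHit.material.color = originalColor;
            lastHit = null;
        }
    }
}
```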
What if the user denies Eye Tracking permission?
If permissions are not granted, the OVREyeGaze component will not control the eye.
Does eye tracking provide confidence values?
Yes, the OVREyeGaze component includes a Confidence field that ranges from 0 to 1, with higher values indicating greater reliability.
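For example, a sketch that watches this value (the GazeConfidenceMonitor class name and threshold are illustrative; Confidence is assumed to be the read-only value exposed by OVREyeGaze):

```csharp
using UnityEngine;

// Illustrative sketch: warns when eye tracking reliability drops below a
// chosen threshold, e.g. so you can fall back to an idle eye animation.
public class GazeConfidenceMonitor : MonoBehaviour
{
    [SerializeField] OVREyeGaze eyeGaze;
    [SerializeField] float warnBelow = 0.5f;

    void Update()
    {
        if (eyeGaze != null && eyeGaze.Confidence < warnBelow)
        {
            Debug.LogWarning($"Eye tracking confidence low: {eyeGaze.Confidence:F2}");
        }
    }
}
```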