At Meta Connect 2022, we introduced Eye Tracked Foveated Rendering (ETFR) for Meta Quest Pro, a new graphics optimization that uses eye gaze to keep the high-pixel-density foveal region aligned with where the user is looking, resulting in significant GPU savings for developers. Since then, developers have integrated ETFR to improve the look of their games, including Vertical Robot, the studio behind Red Matter 2, an award-winning VR game that continually pushes the boundaries of graphics on Meta Quest.
“Integrating ETFR (Eye Tracked Foveated Rendering) into Red Matter 2 was a seamless process, with the activation being as simple as flipping a switch. Our team then focused on maximizing the pixel density, thoroughly testing the results in-game. The outcome was impressive, with a 33% increase in pixel density—equivalent to 77% more pixels rendered in the optical center. The combination of ETFR and Quest Pro’s pancake lenses provides a remarkably sharp image. There is simply no going back from it. ETFR has truly elevated the gaming experience.” —Vertical Robot (Red Matter 2)
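The quoted numbers line up: pixel density is a per-axis measure, so a 33% density increase multiplies the pixel count by 1.33 along each dimension. A quick check:

```python
# Pixel density scales per axis; pixel count scales with its square.
density_gain = 1.33                 # 33% higher pixel density (per axis)
pixel_gain = density_gain ** 2      # 1.33 * 1.33 ≈ 1.77

print(f"{(pixel_gain - 1) * 100:.0f}% more pixels")  # 77% more pixels
```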
As alluded to by Vertical Robot, we’ve designed ETFR to be easy to integrate into both Unity and Unreal. Dive in below to learn more about how ETFR works, and find best practices and FAQs to help you optimize your app’s graphics for Meta Quest.
What Is Eye Tracked Foveated Rendering?
As you may already know, Fixed Foveated Rendering (FFR) renders full pixel density in the center of the screen (a.k.a. the foveal region) and low pixel density around the edges. ETFR, on the other hand, uses eye tracking to move the foveal region around to match where you’re looking. Because the low-pixel-density region stays hidden in your peripheral vision at all times, we can apply more aggressive foveation maps based on eye gaze, which yields more GPU savings than FFR.
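To make the difference concrete, here is a minimal toy sketch (not the actual runtime implementation, which uses hardware fragment density maps) of the idea: FFR pins full density to the screen center, while ETFR centers it on whatever point the eye tracker reports:

```python
def density_map(width, height, focus_x, focus_y, full_radius):
    """Toy fragment density map: 1.0 (full density) inside the foveal
    region around (focus_x, focus_y), 0.25 in the periphery."""
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            dist2 = (x - focus_x) ** 2 + (y - focus_y) ** 2
            row.append(1.0 if dist2 <= full_radius ** 2 else 0.25)
        grid.append(row)
    return grid

# FFR: foveal region fixed at the screen center.
ffr = density_map(8, 8, 4, 4, full_radius=2)
# ETFR: the same map, but centered on the gaze point (here, upper-left).
etfr = density_map(8, 8, 2, 2, full_radius=2)
```

Since the full-density region follows the gaze, the periphery can be shaded even more coarsely than FFR would allow without the user noticing.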
How Does It Work?
Overview
ETFR works by first computing the user's gaze direction from the eye camera images. The gaze direction is then used to move the high-pixel-density foveal region around in the app rendering pipeline.
In order to do this, our Unity and Unreal integrations implement a new Vulkan extension: Tile Offset (VK_QCOM_fragment_density_map_offset). Developed with our partners at Qualcomm, Tile Offset allows the same foveation map to be reused and moved smoothly by specifying a pixel offset value. As shown in the animation below, Tile Offset gives much finer control when moving the foveal region and avoids the tile-flickering artifacts of the traditional method, in which the fragment density map is regenerated every frame. Learn more about Tile Offset in Qualcomm’s blog post.

With Tile Offset

Without Tile Offset
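Conceptually, the offset handed to the extension is just the gaze point converted to pixels and snapped to a hardware granularity (the real extension exposes this as a device property, fragmentDensityOffsetGranularity; the numbers below are illustrative, not queried from a device):

```python
def fdm_offset(gaze_ndc_x, gaze_ndc_y, width, height, granularity=(32, 32)):
    """Convert a gaze point in NDC ([-1, 1] per axis) into a pixel offset
    for the fragment density map, snapped to the offset granularity.
    The 32x32 granularity here is an illustrative assumption."""
    # Offset of the gaze point from the screen center, in pixels.
    dx = gaze_ndc_x * width / 2
    dy = gaze_ndc_y * height / 2
    gx, gy = granularity
    # Snap to the nearest multiple of the granularity.
    return (round(dx / gx) * gx, round(dy / gy) * gy)

# Gaze slightly right of center on a hypothetical 1920x1920 eye buffer.
print(fdm_offset(0.1, 0.0, 1920, 1920))  # (96, 0)
```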
API Architecture
Our eye tracked foveation OpenXR API extends the existing XR_FB_foveation framework, which has powered the runtime-driven fixed foveated rendering we shipped years ago. Our Unity and Unreal integrations call these APIs under the hood, so you won’t need to deal with the details. Native VR apps can follow the same steps to support eye tracked foveated rendering.
Latency Optimization
In the ETFR pipeline, the camera sensors and processing run at the same frame rate as the display and are fully synced to the rendering pipeline to minimize the latency between eye movement and display time. In particular, we offset the eye tracking camera capture time to minimize the gap between the eye tracker producing a new gaze output and that output being used to update the foveation map, as highlighted in the red box below.
The end-to-end pipeline latency measured in our UE4 test app ranges from 46 to 57 ms depending on the current rendering load. Combined with eye tracking gaze prediction, this delivers a solid visual experience across all kinds of eye movements. If you read the ETFRLat metric in logcat (adb logcat -s VrApi), you should see comparable results.
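Gaze prediction compensates for that pipeline latency by extrapolating the gaze forward to the expected display time. A minimal sketch of the idea, using linear extrapolation (the runtime’s actual predictor is more sophisticated):

```python
def predict_gaze(gaze_deg, velocity_deg_per_s, latency_ms):
    """Linearly extrapolate the gaze angle forward by the pipeline latency."""
    return gaze_deg + velocity_deg_per_s * (latency_ms / 1000.0)

# A 100 deg/s smooth pursuit with ~50 ms end-to-end latency would lag
# by ~5 degrees without prediction.
print(predict_gaze(10.0, 100.0, 50))  # 15.0
```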
Limitations and Best Practices
ETFR is a powerful tool, but it’s not a silver bullet and may not suit every type of app. ETFR helps most when your app is fragment-heavy (e.g. it uses complex materials or renders at a very high resolution), because foveation reduces the number of fragment shader executions in a given render pass. For the same reason, it won’t help much if your app is vertex bound. Also keep in mind that ETFR currently applies only to your eye buffer (main layer): if you use additional compositor layers, the GPU cost of rendering and compositing those layers will not be reduced by ETFR; only the cost of your eye buffer will be.
Ultimately, you should decide how and when to use this tool and make sure to thoroughly test your app. Just like FFR, the low-pixel-density peripheral region can cause flickering, especially on high-contrast content and complex geometry. Choose the maximum foveation level carefully, balancing performance and visual quality.
Using The Extra GPU Savings
We recommend using the extra GPU gain from ETFR to increase the render target size for crisper and sharper images. Here’s an example of the GPU render time for FFR / ETFR rendered at both the default and an increased eye texture size in our UE4 test app:
As shown in the table above, if your app is rendering with FFR-3 today, you could try switching to ETFR-2 with an increased eye texture resolution and MSAA-2 for a sharper image at around the same GPU time cost.
The GPU savings you’ll get from ETFR will depend on your app’s content, so make sure to profile your app at different foveation levels and render target sizes in order to determine which combination works best for you.
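A rough mental model while profiling: fragment cost scales with the number of pixels rendered times the fraction of fragments foveation still shades. A toy estimate (the savings percentages below are made-up illustrative numbers, not our measurements; profile your own app for real values):

```python
def fragment_work(width, height, foveation_savings):
    """Relative fragment-shading work: pixel count scaled by the
    fraction of fragments that still get shaded (illustrative model)."""
    return width * height * (1.0 - foveation_savings)

baseline = fragment_work(1680, 1760, 0.00)   # no foveation
ffr      = fragment_work(1680, 1760, 0.30)   # hypothetical FFR savings
# Higher resolution plus (hypothetically) larger ETFR savings can land
# near the same total fragment work as FFR at the default resolution.
etfr_hi  = fragment_work(1920, 2016, 0.40)
```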
Dynamic Foveated Rendering
We also recommend turning on “Dynamic Foveated Rendering” so that the foveation level is adjusted automatically based on the current GPU rendering load. The maximum foveation level can be specified by “Foveated Rendering Level.” Developers should make sure to fully test the rendering quality (e.g. not too many visual artifacts in peripheral vision) at the maximum foveation level before enabling this option.
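Dynamic foveation behaves like a feedback controller on GPU load. A hedged sketch of that idea (the thresholds, levels, and update policy here are illustrative, not the runtime’s actual algorithm):

```python
def adjust_level(level, gpu_util, max_level, raise_at=0.9, lower_at=0.7):
    """Raise the foveation level when the GPU nears saturation, lower it
    when there is headroom; clamp to [0, max_level]. Thresholds are
    illustrative assumptions."""
    if gpu_util > raise_at and level < max_level:
        return level + 1
    if gpu_util < lower_at and level > 0:
        return level - 1
    return level

level = 0
for util in (0.95, 0.95, 0.92, 0.60):  # simulated per-frame GPU utilization
    level = adjust_level(level, util, max_level=2)
print(level)  # 1
```

Because the runtime may push foveation all the way to the configured maximum under load, it is worth verifying visual quality at that maximum level before shipping with the dynamic option enabled.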
Full Developer Package
ETFR is available today for developers in SDK v49 for both Unity and Unreal Engine. All the details to get started can be found in our documentation (Unity / Unreal).
Final Thoughts
ETFR primarily reduces pixel throughput and can potentially enable high-PPD (pixels per degree) rendering for future HMDs. Ultimately, you’ll need to trade off performance savings against visual quality by balancing your app’s graphics complexity, resolution, MSAA, and foveation levels.
We’re committed to supporting you with easier ways to optimize graphics and improve performance, and releasing ETFR is just the beginning of this journey. We look forward to hearing your feedback, bug reports, and questions so we can make our tools more effective. If you have questions, check out the FAQs below.
FAQs
Is OpenXR required for ETFR?
Yes, we only support ETFR under OpenXR, since we’re fully committed to supporting OpenXR APIs moving forward. For example, in UE4, make sure that “OpenXR OVRPlugin” is selected under the “XR API” setting.
Is Vulkan required for ETFR?
Yes. Our UE4/Unity ETFR integration is only supported for apps using Vulkan + Multiview. Due to the lack of fragment density map and new Tile Offset extension support, ETFR won’t work in GLES. Thankfully, our UE and Unity Vulkan integrations are very stable and performant, so we simply recommend all Quest UE and Unity developers use Vulkan for their Quest apps.
Is ETFR an “all-or-nothing” feature?
No. You can turn ETFR on or off, or switch to FFR, at any time at the game engine scripting level. For example, you could disable ETFR entirely in the game’s main menu and only enable it when entering the gameplay scene.
How is the eye tracking permission handled?
The very first time a user opens an app with ETFR, the system displays a prompt requesting the eye tracking permission. If the permission is declined, the app will fall back to FFR rendering; however, you can choose to handle the fallback logic differently at the game engine scripting level.
Note that at any time, users can also choose to pause eye tracking in the universal menu, disable eye tracking in the system settings, or revoke eye tracking permission.
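The default fallback behavior can be modeled as a simple decision: use ETFR only when the device supports eye tracking, the user granted permission, and tracking is currently active; otherwise render with FFR. A sketch (the function and its name are ours for illustration, not an SDK API):

```python
def choose_foveation_mode(supported, permission_granted, tracking_active):
    """Pick ETFR only when every eye tracking prerequisite holds;
    otherwise fall back to FFR. Hypothetical helper, not an SDK call."""
    if supported and permission_granted and tracking_active:
        return "ETFR"
    return "FFR"

# Permission declined: fall back to fixed foveation.
print(choose_foveation_mode(True, False, True))  # FFR
```

Since users can pause or revoke eye tracking at any time, an app’s custom fallback logic should re-evaluate this decision rather than check it only at startup.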