This guide reflects the current state of color management on Meta Quest headsets. Even though it includes a brief introduction to color science, much of the information shared assumes some familiarity with the subject matter. As such, this guide is not a detailed deep dive into color science and color spaces, but it does expand on areas that are crucial to our Meta Quest-specific use cases.
Background
The Meta Quest line of products has grown steadily over the years, and with that growth came the need for a consistent developer experience for display color reproduction. The original decision to source display panels from multiple vendors for Go, each with a similar but visibly different color space, solidified this requirement; otherwise, the same content would look vastly different from headset to headset. Once this color space management pipeline was established, it proved useful in subsequent products including Meta Quest, Rift, and Link.
The most important thing to understand about color management is that RGB tuples like (72,195,44) or (159,37,205) are not well-defined color values. This might seem like a pedantic comment, but consider that the color you see on a screen is formed from the combination of red, green, and blue primaries separated into subpixels, each of which has its own chromaticity and varies somewhat from device to device and pixel to pixel. Maximum (255,0,0) red on one screen may look substantially different from another screen showing maximum red. A well-defined color is a physical property of the photons reaching the eye, a function of how their wavelengths stimulate the viewer's retina.
The great benefit of light being a physical quantity is that we can measure the response of a panel's color primaries and adjust how we drive them to achieve a more accurate representation of the intended color. To do that, we need to specify what those intended colors are. This specification is called a color space.

For nearly 25 years, the standard color space used by the industry for computer monitors (and by extension, most devices with color screens) has been the International Telecommunication Union (ITU) Recommendation BT.709, typically shortened to Rec.709. Although there are subtle differences, you sometimes see sRGB used interchangeably with Rec.709 as they share the same color primaries. sRGB was designed to be a relatively easy manufacturing target for CRTs and was later adopted for LCDs, so that the color masks of computer monitors could be somewhat consistent across the board. Rec.709 and sRGB sacrifice maximum color saturation to get there, however, and so many color spaces have been devised to expand the range of colors that can be shown, particularly for photography and cinematography.

For instance, Adobe RGB is identical to Rec.709 except for a much more saturated, greener green, to better cover the range of on-screen colors that can be printed by a CMYK photographic printer. The DCI-P3 color space increases green saturation relative to Rec.709, but also dramatically deepens the redness of the red channel to help cinematographers work with those deep tones. One of the more important new color spaces is ITU's Rec.2020, which specifically uses pure wavelengths that can be generated by lasers to define the red (630 nm), green (532 nm), and blue (467 nm) primaries.
In addition to primaries, color space specifications define a whitepoint, which specifies the chromaticity that maximum drive on all three primaries should produce. Rec.709 and Rec.2020 both use the D65 standard, which is approximately the apparent color of a black-body radiator at a temperature of about 6500 K. This whitepoint is a major point of frustration, as we explain later.
Display Specifications for Meta Quest Devices
Note: You'll see a recurring theme with the Meta Quest display whitepoint specifications. Even though color standards such as Rec.2020 and Rec.709 specify a D65 whitepoint, every single headset we have shipped as of 2021 has used displays factory calibrated to D75. This started with the Rift CV1, and for consistency, all subsequent display selections have also used D75.
Oculus Go
The first standalone device to get color space management was the Oculus Go, which sourced display panels from two separate vendors, each with its own color space. Both panels have a native whitepoint close to D75, which has a bluer tone than D65. While one vendor's primaries are nominally Rec.709, the other vendor's green and red primaries shift towards yellow. Without correction, this shift, combined with the bluer whitepoint, gave skin tones a noticeably sickly pallor. To balance the color spaces and bring the panels back in line with existing standards, the default color space transformation on Go treats application content as Rec.709 with a D65 whitepoint.
Go Native Color Space
Meta Quest (Quest 1)
Meta Quest uses OLED panels, which, due to the very pure colors achievable from LEDs, have much more saturated color primaries than Rec.709, particularly in the blues and greens. The color space sits somewhere between the Adobe RGB, DCI-P3, and NTSC 1953 color spaces. Compared to Rec.2020, the red primary of the Quest OLED falls somewhat short, while the green and blue primaries come closer.
Meta Quest 1 Native Color Space Comparison
OLED chemistry isn't very consistent, so the primaries and whitepoint can drift somewhat from panel to panel. Unlike Oculus Go headsets, where we use manufacturer-specified color space data, the color response of every Meta Quest panel is measured individually at the factory using laboratory colorimeters so that we can better match color performance between the left and right panels.
Meta Quest 1 Native Color Space Variation
As with the Go panels, Meta Quest panels have a whitepoint closer to D75 than D65, which, without correction, results in a blue cast on content relative to what would be seen on a calibrated monitor.
Meta Quest 1 Native Whitepoint Comparison
Meta Quest 2
Similar to Go, Meta Quest 2 uses LCD panels, but unlike the Go panels, which didn't quite fit the Rec.709 color space, the Quest 2 displays closely follow the Rec.709 RGB primaries while still using a whitepoint very close to D75.
Rift CV1
Similar to the Meta Quest 1 OLEDs, the Rift CV1 uses OLED panels that can vary slightly between units. While there are minor differences between the Rift CV1 and Quest 1 panels, they are negligible. We never intended to perform color space conversions for the Rift CV1; we only correct for various visual artifacts such as luminance variation across the display at different levels. So while on average we converged to similar results on the Rift CV1, individual units can show differences that vary more than what we get on the Quest 1 output.
Rift S
Similar to Go and Meta Quest 2, the Rift S uses LCD panels that provide fairly accurate results. The panels used in all production units are accurate Rec.709 displays, but again use the D75 whitepoint. For the PC VR runtime, Rift S was the first HMD where we employed color space conversion, to make sure content looked just as saturated as it did on the Rift CV1's OLED panels. We discuss this further in the Color Space Correction Pipeline section.
Scope of Color Space Correction in VR
A color space specification normally defines a slew of aspects of the viewing experience: the display's gamma curve (or transfer function), frame rate, luminance, resolution, viewing conditions such as the brightness of the room, and so on. Some of these specifications do not apply to VR at all, while others, such as gamma curves, are handled separately from the color gamut as it applies to this article.
To that end, our color space management pipeline only handles the following aspects:
Remapping R, G, B color primaries and whitepoint from the source VR app’s color space to the native display’s color space.
Normalizing brightness to eliminate potential stereo-luminance disparity between the left and right eyes.
What about gamma correction?
Gamma correction is considered outside the scope of this article because it’s not handled explicitly by the color space pipeline, but directly by the GPU when reading and writing to texture buffers or render targets. Still, let’s briefly touch on it to get it out of the way.
Although actually defined as a color space, sRGB is better known among real-time graphics developers for its gamma curve, even though the standard defines more than that, as we briefly mentioned above. Rec.709 and sRGB share the same color primaries, and we want to avoid confusion when discussing color primaries as opposed to gamma curves. To that end, when talking explicitly about color primaries and not gamma curves, we say "Rec.709"; when referring to the specialized gamma curve defined by the standard, we say "sRGB." In reality, Rec.709 also has its own specialized gamma curve, but as it's not used for computer imagery, few VR developers should care about it.
The factory-calibration specs of our HMD displays are straightforward: all consumer HMDs that Meta has shipped as of 2021 use either an sRGB or a gamma 2.2 curve. Although sRGB and gamma 2.2 are extremely similar, they're not exactly the same. In most cases the difference is negligible, but in some very specific cases it becomes evident. On the PC VR runtime, we make sure that content which might use sRGB gamma is accurately corrected to gamma 2.2; however, that means running explicit shader math to do the conversion. On the Meta Quest runtime, the hardware doesn't have the processor cycles to spare for this seemingly minute difference, so it treats the display gamma as sRGB, leading to some subtle differences in the darkest luminance ranges.
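For reference, the two curves (standard definitions, shown in the decode direction) differ mainly in the linear segment sRGB uses near black, which is exactly where it departs from a pure 2.2 power curve:

$$C_{lin} = \begin{cases} C / 12.92 & C \le 0.04045 \\ \left( \dfrac{C + 0.055}{1.055} \right)^{2.4} & C > 0.04045 \end{cases} \qquad \text{vs.} \qquad C_{lin} = C^{2.2}$$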
We expect all rendered content sent from the VR app to the VR compositor to be encoded using either 8-bit sRGB or a floating-point format (for example, R11G11B10F, R16G16B16A16F, and so on). When the texture format is set correctly, the GPU performs the conversions when reading from and writing to these buffers, making sure all shader math and texture samples are treated linearly. Keep in mind that GPUs only know how to natively handle the sRGB gamma curve and floating-point formats, which are inherently compressed in a manner similar to gamma curves. Trying to roll your own gamma compression, such as gamma 2.4, which is similar to the Rec.709 standard, can be painful in various ways and is outside the scope of this guide.
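As a concrete illustration, the following is a minimal OpenGL ES 3.x sketch of letting the GPU handle the gamma decode by picking the right texture format; the helper function is ours for illustration, not part of any Meta SDK:

// Minimal OpenGL ES 3.x sketch: allocate the eye buffer with an sRGB
// internal format so the sampler hardware decodes texels to linear
// automatically, keeping all shader math in linear space.
#include <GLES3/gl3.h>

GLuint CreateEyeBufferTexture(int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    // 8-bit sRGB storage, linear when sampled. A float format such as
    // GL_R11F_G11F_B10F is also read as linear, with no decode needed.
    glTexStorage2D(GL_TEXTURE_2D, 1, GL_SRGB8_ALPHA8, width, height);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    return tex;
}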
For more information on the intricacies of the sRGB gamma curve and on handling gamma, including in the Oculus Rift pipeline, see The sRGB Learning Curve.
What about display-brightness correction?
Color spaces such as DCI-P3 and sRGB define ideal maximum display brightness specifications. For example, sRGB is expected to have a screen luminance of 80 nits, but that is defined for an assumed viewing-environment brightness of 200 lux. While some modern TVs have sensors to gauge the room's brightness and adjust their luminance accordingly, this doesn't really apply to VR, as the VR user is typically in a light-locked HMD. This is why we tend to ignore screen-brightness standards. Instead, we compare each eye's display at max white (255,255,255) and use the luminance of the darker side as the normalization factor. This lets us normalize the luminance of the display presented to each eye, making sure we don't accidentally show a luminance disparity that would lead to stereo discomfort in VR.
For example, say the left eye's display shows a max luminance of 100 nits while the right eye's shows 90 nits. We would scale down the luminance of the left eye so that its max is also 90 nits. To do this, we scale all colors presented to the left eye by 90% in linear space, before gamma correction. So when the VR app requests (255,255,255), the left display actually shows (243,243,243). Note: 243 is not 90% of 255, but it is 90% after gamma correction, assuming gamma ~2.2 (that is, 0.9^(1/2.2) * 255 ≈ 243). This 90% scale factor is baked into the Color Correction Matrix, so it comes at no extra real-time cost in the pipeline.
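Equivalently, because the scale is linear, it folds directly into the matrix for the brighter eye:

$$\mathrm{CCM}'_{left} = 0.9 \times \mathrm{CCM}_{left}, \qquad 255 \times 0.9^{1/2.2} \approx 255 \times 0.953 \approx 243$$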
Color Space Correction Pipeline in VR
Although the PC and Mobile VR runtimes evolved separately, over time we made sure they converged to have similar color space correction pipelines with similar looking developer-facing APIs. The various custom corrections we perform for Rift CV1, Rift S and Quest 1 such as mura correction, screen-space uniformity correction, and so on are outside the scope of this guide.
The heart of the color space correction pipeline in the VR runtimes is the 3x3 Color Correction Matrix (CCM), which can remap a given set of RGB color primaries and whitepoint to a new set of color primaries and a new whitepoint. We use the CCM to convert a source image's color space to the target display's native color space, where we define the terms as:
Source color space: The VR app's authoring, mastering, or working color space, defined by the default color space in the VR runtime or explicitly set by the VR app using the provided API calls.
Target color space: The display’s native color space. That is, the physical display’s characteristic color space, sometimes known per HMD instance but sometimes only known as an average value for a given HMD model.
To avoid confusion in this guide, we will keep differentiating between source and target color spaces.
High-Level Color Space Correction Flow
These flow diagrams show the high-level stages of initialization and execution for a sample app that might use the DCI-P3 color space and run on a Quest 2 HMD.
Color Correction Flow
The VR compositor is distributed as part of the runtime on both the PC and standalone versions of our HMDs. The VR app is usually a third-party application that makes various SDK calls to talk to the VR runtime, which houses the VR compositor. The VR app can also be a first-party app such as Meta Quest Horizon, which renders the main home environment, or VR Shell, the GUI where users can view the Store.
The VR app can never directly output imagery to the VR display's back or front buffer. It has to route its images through the VR compositor, which is the stage that applies, among other things, lens-distortion correction, time warp, and color correction.
Source Color Space
The PC and Quest runtimes are similar in how they deal with the source color space, which is effectively the color space the VR app is rendering its content for. The source color space is sometimes referred to as the mastering or authoring color space because in theory the artists authoring the source assets should be using workflows that allow them to vet their assets—be it textures, videos, photos, or images—using this color space on their desktop monitors. This is to help prevent unexpected results when the asset is finally used in the VR app and rendered to the frame.
Once the source assets are picked up by the VR app for real-time rendering, the shaders should not have to apply yet another color space conversion, as that would be wasteful. If the VR app is a video or photo viewing application, the color space choice is more straightforward because it is designated by the video or image format. Almost all web content is expected to be in the sRGB color space unless otherwise stated; this is also the expectation for common image formats such as JPEG, PNG, and GIF. That said, the color space of an asset is whatever the artists define it to be. As long as the rest of the pipeline treats it that way, all is good.
The one way the PC and Quest runtimes differ has to do with the default color space each runtime uses until the VR app explicitly selects one. For more information, see “What do we mean by ‘Default Color Space’?”.
Target Color Space
The chromaticity tolerances of display manufacturers are usually much looser than we'd like, especially on OLEDs and when panels are sourced from different vendors. This is where we stretch our muscles to achieve better color fidelity than what the display vendors can provide.
One major aspect of our color space correction algorithm is its ability to handle panel-to-panel color space variation, ensuring that the same app looks very similar on different units. We do this by capturing the color response of each HMD display panel on the factory floor before it's placed into the HMD enclosure and sealed shut.
The Rift CV1 is unique because it was developed before we introduced the color-correction pipeline APIs. Since we didn’t intend to do color space correction for Rift CV1 in the classic sense, we never extracted the data for those HMDs. So for Rift CV1, the target color space is simply the average color space we gathered from doing manual color space measurements on a number of different units prior to release.
Starting with Rift S, and Quest 1 HMDs however, this pipeline became more standardized. Each HMD has a unique color response file that is parsed and used when the VR runtime is initialized. This color response data is used to help build a CCM that is unique to the HMD in use. To get a better understanding on how we generate the CCM, see Color Space Mapping.
We provide the ovr_GetHmdColorDesc and vrapi_GetHmdColorDesc SDK calls (for PC and Quest respectively), which return an average color space for a given HMD. Going forward, we are also providing a corresponding OpenXR extension for this very purpose (see Further Reading). We do not disclose the per-HMD measured color spaces directly to the VR app developer, for two reasons. First, it lets us return a simple color space enum value from our list of color spaces; if we were to return a more detailed definition, we would have to provide a number of parameters that complicate the API. Second, we do not want to cause further confusion for our developers by returning a slightly different value for each HMD.
Source Color Spaces We Offer in our APIs
In both the PC and Quest runtimes, we offer the same set of predefined color spaces. While we could allow an expert user to define their own CIE-xy coordinates as a custom source color space, we have opted not to, as it complicates matters and is too complex for most VR app devs to utilize. This also keeps the conversion vetted within the parameters we determine, as opposed to open-ended scalar conversions that VR app devs might decide to push into the runtime.
The current list as of 2021 is as follows and is also documented in the developer documentation topic Set Color Space:
Unmanaged: Tells the VR compositor that no color correction should be applied. Using this value is not recommended for regular VR apps and is only suggested for research and testing, as it causes apps to look different on different HMDs. The VR compositor achieves this by keeping the CCM as an identity matrix, effectively passing colors through to the display and bypassing any of the corrections we would normally bake into the 3x3 CCM, including per-HMD color space adjustments and stereo-luminance disparity corrections.

Rec.2020: Rec.2020 color space using a D65 (daylight) whitepoint. Using this color space requires a good understanding of how to perform Rec.2020 mastering. Some pitfalls are described in What do we mean by Default Color Space.

Rec.709: Rec.709 color space using a D65 whitepoint. Same as sRGB for RGB color primaries and whitepoint.

Rift CV1: Rift CV1's unique color space, and the preferred and default color space for the PC VR runtime even when using the Rift S. Its gamut falls approximately between Adobe RGB and DCI-P3, with a D75 whitepoint. Average CIE 1931 xy color-primary values: Red (0.666, 0.334), Green (0.238, 0.714), Blue (0.139, 0.053), White (0.298, 0.318).

Rift S: Rift S's unique color space: Rec.709 (sRGB) RGB color primaries with a D75 whitepoint.

Meta Quest: Meta Quest's unique color space, which has slight differences from the Rift CV1 color space. CIE 1931 xy color-primary values: Red (0.6610, 0.3382), Green (0.2281, 0.7178), Blue (0.1416, 0.0419), White (0.2956, 0.3168).

P3: Also called the DCI-P3 color space; uses a D65 whitepoint.

Adobe RGB: Similar to sRGB, but with deeper greens, using D65 for its whitepoint.
Note that we currently do not offer a unique “Quest 2” color space mainly because it is practically identical to the Rift S color space.
What do we mean by Default Color Space?
When we refer to the default color space, we are specifically referring to the source color space the SDK assumes for the active VR app. The VR compositor assumes a default color space for the VR app until the VR app optionally designates one using the provided SDK API. This default color space choice used to differ between PC VR and Quest runtimes, but over time we modified things to help converge them to a unified solution.
History Behind PC VR Runtime’s Default Color Space
The Rift S was the first HMD to make use of color space correction in the PC VR runtime. Prior to the Rift S, the only PC HMD we fully supported was the Rift CV1. Since the Rift CV1 shipped with an OLED panel that had a wider color space than the standard sRGB color space, we deliberately avoided color space conversion, for the following major reasons:
Almost all PC games are authored for the sRGB color space. If we treat incoming content as sRGB and limit our color gamut by doing an sRGB-to-Rift CV1 color space conversion, then we have effectively thrown away a good portion of the gamut that makes our OLED panels shine with deep, saturated colors.
If we treated all PC VR content as "authored for sRGB unless specified otherwise," then in our opinion very few VR developers would know or bother to flip the switch to designate a wider color space, even though they might want to use it. This is especially problematic with games shipping on SteamVR, which have an "unknown" color space since SteamVR doesn't take this into account when rerouting a SteamVR app's eye buffers to the SDK. This could mean the extra color gamut the OLED offers would go to waste.
We believe sRGB content on a wider-gamut OLED display like the one in the Rift CV1 actually looks good in almost all cases and has become the expected output saturation from the consumers’ point-of-view. In our experience, most developers don’t like the limited sRGB color space, and most customers already use desktop displays where regular sRGB content is expanded to a wider color space like DCI-P3.
If we were to treat a VR-app developer's intention to use sRGB as "accurate at all costs," we would most likely have to add a new vivid/vibrant color mode so that our displays don't look washed out, similar to the options modern TVs provide.
While the oversaturation resulting from sRGB being treated as DCI-P3 is usually appealing, it of course has its limits. For example, treating sRGB colors as if they were Rec.2020 shows an extreme amount of oversaturation, making content look obviously wrong; it may cause certain skin tones to look sunburnt. On top of that, since Rec.2020 uses D65, when the D65-to-D75 whitepoint conversion is also folded into the CCM, gray tones visibly push toward orange/peach. See the following comparison image, which shows the result when sRGB content is treated as if it were authored for Rec.2020.
Example of bad and good color space usage. Top: sRGB content treated as Rec.2020. Bottom: proper sRGB output of the same content.
However, once we started working on the Rift S, it became clear that we needed a form of color space conversion. The Rift S display is very close to sRGB, and as noted above, most content is authored for sRGB. So even though passing content through untouched would technically have been the more accurate thing to do, the native color space of the Rift S (Rec.709) looked washed out compared to the Rift CV1. From then on, the PC VR runtime treated the VR app's content as if it were authored for the Rift CV1's color space, even though we knew very few developers were actually doing any color space-related authoring. Once we shipped the Rift S, positive customer feedback touting that "the Rift S display looks vibrant for an LCD," largely thanks to this treatment, helped seal the deal. Thus, when the Rift S shipped, the conversion from source Rift CV1 to target native Rift S was the only color space conversion we supported.
Note that source content on PC was theoretically authored for D65, the standard whitepoint. Since both the Rift CV1 and Rift S panels have the same D75 whitepoint, we also consciously didn't perform any D65-to-D75 conversion, as content was assumed to be in the Rift CV1 color space. Whitepoint never became a concern because we believe users' eyes quickly adapt to different whitepoints, especially in VR, where the display is all the user can see and there are no surrounding physical light sources to compare against. However, if you've felt that the content on your Oculus PC HMD has always looked a bit blue, this default D75 whitepoint is the reason.
Over time, in an effort to unify our color space correction pipeline with standalone (Quest) VR, we introduced into the PC VR runtime the same color space standards the standalone VR runtime already offered. This helps ensure that developers using the PC VR runtime while developing their Unity or UE4 app can switch over to a native Quest build and see very similar results. It meant replicating the standalone VR runtime's behavior on the PC VR runtime; in doing so, however, we kept the Rift CV1 color space as the default on the PC VR runtime.
History Behind the Standalone Runtime’s Default Color Space
The color space correction pipeline was originally introduced for Oculus Go to address the color space differences between vendors. Since Go was roughly calibrated to Rec.709, that was the recommended target for all developers. This worked nicely, and there was much rejoicing.
However, once this technology moved over to Meta Quest, we realized that its wider, non-standard color space was going to cause unexpected results for developers just starting to build apps for the Meta Quest HMD. Addressing this was going to require some outside-the-box thinking. In an effort to be future-facing and compatible with all future HMD color spaces, we decided to use the widest standardized color space, so Rec.2020 was chosen. Over the years, we've realized there was room for improvement in our communication around this, as it has been a source of pain for developers: many have had to reach out to developer support to figure out why their content looks so saturated compared to their Rift build. This was made even more painful because Rec.2020 was a hard requirement, with no option for developers to pick a different color space.
Over time, our developer relations and engine integrations teams shared an undocumented workaround: a special line in the Android manifest file that declares a different source color space for the VR app. This got shared more and more widely, making it obvious that developers either didn't want to deal with Rec.2020 or didn't know how.
In an effort to remove this pain point, we:
Introduced a new API that made it easier for developers to switch the source color space.
Addressed the most common paths by making Unity's and UE4's default source color space the same on PC and standalone, that is, the Rift CV1 color space.
Made the new OpenXR API use the Rift CV1 color space as the default choice.
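For reference, here is roughly what that looks like through the OpenXR color space extension (XR_FB_color_space); a minimal sketch, assuming the extension was enabled at instance creation and the session already exists:

// Minimal sketch: select the Rift CV1 source color space via the
// XR_FB_color_space extension. Extension functions are loaded through
// xrGetInstanceProcAddr.
#include <openxr/openxr.h>

void SelectRiftCv1ColorSpace(XrInstance instance, XrSession session)
{
    PFN_xrSetColorSpaceFB pfnSetColorSpaceFB = NULL;
    xrGetInstanceProcAddr(instance, "xrSetColorSpaceFB",
                          (PFN_xrVoidFunction*)&pfnSetColorSpaceFB);
    if (pfnSetColorSpaceFB != NULL) {
        // XR_COLOR_SPACE_RIFT_CV1_FB is already the OpenXR default,
        // but setting it explicitly documents the app's intent.
        pfnSetColorSpaceFB(session, XR_COLOR_SPACE_RIFT_CV1_FB);
    }
}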
As much as we want to change the default color space in the VrApi from Rec.2020 to Rift CV1, doing so would end up retroactively affecting apps that might actually be properly authored for Rec.2020. We don’t have a clear picture of which apps are using Rec.2020 on purpose or unintentionally. We could, in theory, decide to switch the default color space for any Quest apps shipping after SDK version X, but for now we’ve decided such a change is not worth the hassle.
There are likely a number of Quest apps using the default Rec.2020 color space choice without knowing they are. To date, we have reached out to developers who obviously had this problem in their apps, but we cannot guess the artistic intention of each app’s visual direction. To that end, we opted to provide this guide to help developers be aware of the issue and make that informed decision for their own apps.
How Does Link Deal with Color Spaces?
Link makes use of both PC VR and Quest color correction pipelines. When a Link session starts, the HMD requests the color space it wants the PC compositor to use. This color space is selected to be as close to the HMD’s final color space as possible without sending over the custom CIE-xy coordinates for that particular HMD display.
For Quest 1, the PC compositor color space is Rift CV1, which was the closest color space to the Quest in 2019 when Link first shipped, before we introduced the new color space APIs.
For Quest 2, the PC compositor color space is the Rift S color space, which is practically the same as Quest 2's.
Once the PC-composited, color space-corrected images are sent over to the Meta Quest HMD, the compositor performs the appropriate conversion from the PC compositor's output color space to the Quest's native display color space using the display panel's unique color primaries. Performing the majority of the color space correction before feeding the images into Link's video compression pipeline helps minimize banding, as it prevents major divergence between what the PC outputs and what the Meta Quest uses.
Color Space Correction Algorithm
This section discusses the heart of the color correction algorithm which includes the high-level linear algebra used to generate our CCM (Color Correction Matrix) as well as what we do in the real-time pipeline.
Color Space Mapping
The first step in color space correction is mapping the RGB tuples in both the source and destination color spaces back to an absolute coordinate space whose three dimensions fully cover the wavelength range visible to humans, roughly 400 nm to 700 nm. In this case, we use the industry-standard CIE XYZ coordinate system. To do this, we measure the xy chromaticities of the panels at full red, green, blue, and white, and then generate a conversion matrix that transforms RGB tuples to XYZ tristimulus values.
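In standard CIE form (not Meta-specific): given measured chromaticities $(x_i, y_i)$ for each primary $i \in \{R, G, B\}$, each primary contributes an unscaled tristimulus column

$$\mathbf{p}_i = \begin{pmatrix} x_i / y_i \\ 1 \\ (1 - x_i - y_i) / y_i \end{pmatrix}$$

and the RGB-to-XYZ matrix is those columns scaled so that RGB = (1,1,1) lands exactly on the whitepoint's tristimulus vector $\mathbf{W}_{XYZ}$ (built the same way from the measured white, with $Y$ normalized to 1):

$$M_{RGB \to XYZ} = \begin{pmatrix} \mathbf{p}_R & \mathbf{p}_G & \mathbf{p}_B \end{pmatrix} \operatorname{diag}(\mathbf{s}), \qquad \mathbf{s} = \begin{pmatrix} \mathbf{p}_R & \mathbf{p}_G & \mathbf{p}_B \end{pmatrix}^{-1} \mathbf{W}_{XYZ}$$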
An RGB-to-RGB conversion matrix that converts from the application color space to the panel color space is easily generated from the source and destination color space matrices above: a naive color space conversion multiplies RGB values by the source color space matrix and then by the inverse of the target color space matrix.
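That is, with $M_{src}$ and $M_{dst}$ denoting the source and target RGB-to-XYZ matrices built as above, the naive conversion of a linear RGB value is

$$\mathbf{c}_{dst} = M_{dst}^{-1} \, M_{src} \, \mathbf{c}_{src}$$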
This matrix looks okay as long as the whitepoints of both the source and target color spaces match, but when they differ, shades of gray appear incorrect.
Whitepoint Adaptation
To handle the whitepoint shift, we introduce another intermediate color coordinate system, LMS, which better matches the long-, medium-, and short-wavelength cone responses of the human visual system. The conversion from XYZ to LMS is done using the Bradford transformation matrix, shown below as M_A; its inverse converts back to XYZ.
The LMS values of the source and destination whitepoints are calculated, and the actual adaptation matrix is generated with a simple component-wise scale in the LMS coordinate space, as shown below.
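In standard notation (the Bradford matrix values below are the standard published ones):

$$M_A = \begin{pmatrix} 0.8951 & 0.2664 & -0.1614 \\ -0.7502 & 1.7135 & 0.0367 \\ 0.0389 & -0.0685 & 1.0296 \end{pmatrix}$$

$$\begin{pmatrix} L \\ M \\ S \end{pmatrix} = M_A \, \mathbf{W}_{XYZ}, \qquad M_{adapt} = M_A^{-1} \operatorname{diag}\!\left( \frac{L_{dst}}{L_{src}}, \frac{M_{dst}}{M_{src}}, \frac{S_{dst}}{S_{src}} \right) M_A$$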
Combined Transformation (and Caveats)
The color space mapping and whitepoint adaptation matrices are combined as shown below; this is the conversion matrix currently used on Meta Quest standalone devices.
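Putting the pieces together:

$$\mathrm{CCM} = M_{dst}^{-1} \, M_{adapt} \, M_{src}, \qquad \mathbf{c}_{dst} = \mathrm{CCM} \, \mathbf{c}_{src}$$

applied to linear RGB values.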
Reducing the conversion to a single matrix multiply for application RGB values minimizes the ALU cost in the TimeWarp fragment shader, but it cannot handle non-linear effects like clipping at the maximum drive levels of the individual color channels. Bright gray colors may clip on the green channel or appear to have overdriven reds because of the whitepoint adaptation. Targeting a D75 whitepoint instead of D65 would alleviate this issue, but it would defeat the purpose of moving to standard color spaces, and content that was not authored with that whitepoint in mind would look substantially different. An error-correction term could be added to clamp overdriven channels and pull them toward the expected whitepoint, but at the cost of added ALU load.
Color Space Correction Application
Real-time Correction
Earlier, we mentioned that the CCM cannot handle non-linear effects unless we are willing to incur more real-time ALU cost on the GPU. Let's expand on that before talking about the real-time application of the CCM.
While the CCM is fairly efficient as a means of performing color space conversion in real time, it's not perfect, especially when dealing with all the ways a given display can stray from the ideal color space and gamma curves. It's not hard to see why: the calculation that creates the CCM uses only a select few of the measured values provided by the HMD, namely the max R, G, B, and W values. On the Quest HMDs, other measurements are not utilized.
While a detailed explanation is outside the scope of this article, when running a Rift S HMD, the PC VR runtime makes use of all the measured values for a more accurate gamma 2.2 ramp applied to R, G, and B separately. It achieves this using a 3D LUT texture that stores the gamma-ramp corrections, where the (x,y) coordinates map to the 2D display surface and the z coordinate maps to the luminance ramp for each R, G, B component. For the Rift S, the PC VR runtime then applies the color space conversion using the CCM on top of the gamma-ramp correction. Comparatively, sampling a 3D LUT texture in real time on the Quest GPU would be too costly, especially when the LUT is accessed as a dependent-texture read, which is an anti-pattern on mobile-class SoC GPUs. This means gamma and color space deviations that might occur in the middle luminance ranges on the Quest display are simply not accounted for, because the CCM can only do a linear transform from one color space to another.
With that disclaimer out of the way, let's talk about how the CCM is used in the PC and standalone compositors. As we briefly mentioned in the High-Level Color Space Correction Flow section, the major work the compositor does is to take the eye-buffer images submitted by the VR app and apply the necessary modifications to them before showing the modified output on the HMD display.
When the HMD display is active, the compositor is also active as long as there's an in-focus VR app submitting images. Before every refresh of the display, the compositor prepares a new final image by resampling the source images sent from the VR app. This means the compositor has to apply color correction every frame for every visible display pixel, potentially a few times over if more than one VR app is submitting eye buffers (for example, Beat Saber with the boundary visible simultaneously). To do this efficiently, we use a pixel shader, a relatively small GPU program that runs for every display pixel; the compositor's pixel shader performs the color conversion while also handling lens-distortion correction and time-warp reprojection.
As the shader samples each source-image pixel, it applies the 3x3 CCM to the R, G, B components of the color using a matrix multiplication, leaving the alpha component untouched. This is where the Beat Saber-with-boundary example comes into play: since the CCM is applied to each set of VR-app-submitted images separately, the compositor can treat each VR app's source color space appropriately.
Since the CCM is meant to operate on linear (gamma-decompressed) color values, we need to make sure the color value read from the source texture is converted to linear space before multiplying by the CCM. After the CCM multiplication for each image, we gamma-compress the converted values back to sRGB or gamma 2.2 to be display-ready.
Assuming the VR-app-submitted texture was specified as an sRGB format texture, gamma decompression (that is, converting colors to linear space) from an sRGB-gamma curve is handled automatically by the GPU’s texture-sampler hardware. This allows the shader code to have immediate access to linear values as soon as the texture sampler returns a color sample from the VR app textures. Both the PC and standalone GPUs rely on this feature to avoid the need to perform manual sRGB gamma decompression in the shader.
As for gamma compression, the PC and standalone runtimes do it in different ways, and the difference mainly stems from the cycles available on gaming-grade PC GPUs compared to the mobile-class SoCs in the Quest HMDs. The PC runtime accurately applies a gamma 2.2 conversion in the shader, while the Quest runtime uses the GPU's automatic conversion back to sRGB gamma. Although the difference is very minute, it's there for those looking for it; at least one developer noticed it when comparing the PC and Quest versions of their game, because their app relied on very dark colors.
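Putting the pieces together, here is a simplified C model of the per-pixel work described above (the real version lives in the compositor's fragment shader; the names and structure here are illustrative):

#include <math.h>

typedef struct { float r, g, b; } Rgb;   // linear RGB; alpha is untouched

// PC path: accurate gamma 2.2 encode done with explicit shader math.
static float EncodeGamma22(float c) { return powf(c, 1.0f / 2.2f); }

// Quest path: the GPU's automatic sRGB encode (standard definition).
static float EncodeSrgb(float c)
{
    return (c <= 0.0031308f) ? 12.92f * c
                             : 1.055f * powf(c, 1.0f / 2.4f) - 0.055f;
}

// 'lin' is the texel after the sampler's automatic sRGB-to-linear decode.
Rgb ColorCorrect(const float ccm[3][3], Rgb lin, int isPc)
{
    Rgb out = {
        ccm[0][0] * lin.r + ccm[0][1] * lin.g + ccm[0][2] * lin.b,
        ccm[1][0] * lin.r + ccm[1][1] * lin.g + ccm[1][2] * lin.b,
        ccm[2][0] * lin.r + ccm[2][1] * lin.g + ccm[2][2] * lin.b,
    };
    // Gamma compress back for display; this is the step where PC (2.2)
    // and Quest (sRGB) diverge slightly in the darkest ranges.
    out.r = isPc ? EncodeGamma22(out.r) : EncodeSrgb(out.r);
    out.g = isPc ? EncodeGamma22(out.g) : EncodeSrgb(out.g);
    out.b = isPc ? EncodeGamma22(out.b) : EncodeSrgb(out.b);
    return out;
}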
Why not use the GPU’s native color-correction pipeline?
All GPUs, whether on Quest or PC, have fixed-function color-correction pipelines that can perform color correction practically for free, so you might wonder why we don't use them directly. There are a few reasons, mainly specific to VR:
We perform chromatic-aberration correction, which requires us to stop modifying the result after the color channels are spatially split apart. If we were to continue color correcting after the split, a pure red produced by chromatic-aberration correction could end up with a bit of green and blue mixed in, leading to visible color-fringing artifacts. For displays with a narrower color gamut, like the Rift S and Quest 2, this issue becomes even more obvious.
GPU color-correction pipelines can only correct for one color space across the whole display, which means we couldn't support multiple VR apps that need different color spaces.
A single 3x3 CCM matrix multiplication is very cheap even on the Quest GPU.
Summary of Developer Recommendations
To summarize the technical points of this guide:
Use Rift CV1 as your color space.
Use the Rift CV1 color space as the “VR app color space” for all current HMDs unless you have the expertise and a specific need to diverge from it.
The Rift CV1 color space provides a good balance of color vibrance and accuracy while using the full color gamut available on our OLED HMDs. In almost all cases, content originally authored for the sRGB color space will continue to look pleasing without any extra tweaks.
Be aware that the whitepoint for the Rift CV1 color space is D75 while most industry color standards are set to use D65. Content originally authored for sRGB will show a slightly cooler output due to the D65 content being treated as D75.
Don't choose Rec.709 (sRGB) as your color space even if it might feel like the more accurate thing to do. If you do, VR apps running on a Quest 1 or any other wide-gamut HMD will not be able to target the more saturated colors the display is capable of showing, leading to washed-out colors.
If developing with Unity or Unreal Engine:
• Adjust the color space choice provided in the VR settings. If you don't see a color space choice, you might be using an older version of the Meta Quest engine integration and need to upgrade to a newer version.
• Unity and Unreal default to "Rift CV1".

If developing with the native APIs (C-API for PC, VrApi for Quest):
• Use the provided SDK APIs to target the necessary color space. These are documented in this article and also linked in the Further Reading section.
• If you do not set your color space, the PC runtime will default to Rift CV1 while the Quest runtime will default to Rec.2020. Be aware that while PC defaults to the recommended color space, Quest does not.

If developing with OpenXR:
• Use the provided Meta color space extension to target the necessary color space.
• If you do not set your color space, both PC and Quest runtimes will default to Rift CV1, the recommended color space.
If you need a high degree of color accuracy...
Calibrate and set up artist authoring workflows to use the color space selected in the VR APIs or engine. This will help artists see a more accurate match between their desktop monitors and VR. The calibration values are provided as CIE-xy coordinates in this document (and in our SDK API reference linked in the Further Reading section).
Consider that because the Rift CV1 color space is not an industry standard, it can be hard to target with some desktop authoring workflows. As a workaround, you may use DCI-P3 instead, as it has wider market adoption and displays that can cover 100% of its gamut. For accurate results, be sure to also select this color space in the VR APIs or engine options.
General color authoring recommendations
Decide on your app's color space as early as possible, and base the decision on our recommendations. While you can change color space during production, doing so without complications will be very challenging if the artists care about color accuracy.
Avoid relying on levels below 13 out of 255 (for 8-bit output) in gamma space, or 0.0015 out of 1.0 as linear sRGB shader output. This is because the OLED display in the Quest 1 and the LCD display in the Quest 2 have very different contrast ratios: what your eyes can resolve in the extreme dark on an OLED display might be completely washed out on Quest 2 due to backlight leakage. If the app makes heavy use of this range, we recommend tonemapping the output to a more usable, brighter range, either in real time where performance is available, or baked into the textures where performance is limited, such as on Quest.
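The two thresholds quoted above are the same cutoff expressed on either side of the display gamma, assuming the 2.2 curve:

$$\left( \frac{13}{255} \right)^{2.2} \approx 0.051^{2.2} \approx 0.0015$$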
If possible, develop your apps using a Rift CV1 or Quest 1 HMD for better color tuning, to ensure your content makes use of the wider gamut available on those displays. That said, we are aware these HMDs are no longer sold, making them somewhat difficult to acquire.
Content captured from the HMD will, in most circumstances, not be appropriate for color-accurate viewing. This includes adb screencap. We do not recommend using such captures to assess color accuracy during development unless it's strictly for relative comparisons of two similarly captured images.
Many Unity and Unreal VR developers develop and test their Quest apps using a PC VR build over Link for faster iteration. When comparing PC Link and native Meta Quest builds of the same app, the results should look very similar, assuming the same color space is chosen in both cases. The only apparent difference should be gamma correction at the very low end, where the PC runtime follows the more accurate gamma 2.2 curve while native Meta Quest apps follow sRGB. Gamma 2.2 is more accurate because Meta Quest HMD displays are factory calibrated for gamma 2.2, not sRGB. Most of the very dark ranges where the difference lies are mostly unusable on LCD displays anyway.
If you’re a VR developer, keep an eye out for updates in this space to help improve your workflows. If you have any questions, reach out to us over various channels including our Meta Community Forums, Meta Quest Developer Support or your Meta contact.
Finally, be sure to check out our Further Reading section where we list developer API calls around color space management as well as other relevant information.