Client Distortion Rendering

In the client distortion rendering mode, the application applies the distortion to the rendered image and makes the final Present call.

This mode is intended for application developers who want to combine the Rift distortion shader pass with their own post-process shaders for increased efficiency. It is also useful if you want to retain fine control over the entire rendering process. Several API calls are provided which enable this, while hiding much of the internal complexity.

Set Up Rendering

The first step is to create the render texture that the application will render the undistorted left and right eye images to.

The process here is essentially the same as for SDK distortion rendering. Use the ovrHmdDesc struct to obtain information about the HMD configuration and allocate the render texture (or a separate render texture for each eye) in an API-specific way. This was described previously in the Render Texture Initialization section of this document.
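The required texture dimensions can be queried per eye with ovrHmd_GetFovTextureSize. The following is a minimal sketch for a single shared texture, assuming the default eye FOVs and a side-by-side layout (the variable names are illustrative):

    // Query the recommended texture size for each eye at 1.0 pixel density.
    ovrSizei texSizeLeft  = ovrHmd_GetFovTextureSize(hmd, ovrEye_Left,
                                                     hmd->DefaultEyeFov[0], 1.0f);
    ovrSizei texSizeRight = ovrHmd_GetFovTextureSize(hmd, ovrEye_Right,
                                                     hmd->DefaultEyeFov[1], 1.0f);

    // Lay the two eye views out side by side in one shared render texture.
    ovrSizei textureSize;
    textureSize.w = texSizeLeft.w + texSizeRight.w;
    textureSize.h = (texSizeLeft.h > texSizeRight.h) ? texSizeLeft.h : texSizeRight.h;
    // Allocate the render target at textureSize using your graphics API.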

The next step is to obtain information about how the rendering and distortion should be performed for each eye. This is described by the ovrEyeRenderDesc struct. The following table describes its fields:

Field                       Type          Description
Eye                         ovrEyeType    The eye that these values refer to (ovrEye_Left or ovrEye_Right).
Fov                         ovrFovPort    The field of view to use when rendering this eye view.
DistortedViewport           ovrRecti      Viewport to use when applying the distortion to the render texture.
PixelsPerTanAngleAtCenter   ovrVector2f   Density of render texture pixels at the center of the distorted view.
ViewAdjust                  ovrVector3f   Translation to be applied to the view matrix.
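
For reference, a sketch of the corresponding declaration, reconstructed from the fields above (consult OVR_CAPI.h in your SDK version for the authoritative definition):

    typedef struct ovrEyeRenderDesc_
    {
        ovrEyeType  Eye;                       // ovrEye_Left or ovrEye_Right.
        ovrFovPort  Fov;                       // Field of view for this eye view.
        ovrRecti    DistortedViewport;         // Distortion viewport on the render texture.
        ovrVector2f PixelsPerTanAngleAtCenter; // Pixel density at the distorted view center.
        ovrVector3f ViewAdjust;                // Translation applied to the view matrix.
    } ovrEyeRenderDesc;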

Call ovrHmd_GetRenderDesc for each eye to fill in ovrEyeRenderDesc as follows:

    // Initialize ovrEyeRenderDesc struct.
    ovrFovPort eyeFov[2];
    
    ...

    ovrEyeRenderDesc EyeRenderDesc[2];
    
    EyeRenderDesc[0] = ovrHmd_GetRenderDesc(hmd, ovrEye_Left,  eyeFov[0]);
    EyeRenderDesc[1] = ovrHmd_GetRenderDesc(hmd, ovrEye_Right, eyeFov[1]);    
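
The eyeFov values elided above are commonly seeded from the headset defaults; a minimal sketch, assuming the ovrHmdDesc fields mentioned earlier:

    // One common choice: use the HMD's default field of view for each eye.
    eyeFov[0] = hmd->DefaultEyeFov[0];
    eyeFov[1] = hmd->DefaultEyeFov[1];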
 

Set Up Distortion Rendering

In client distortion rendering mode, the application is responsible for executing the necessary shaders to apply the image distortion and chromatic aberration correction.

In previous SDK versions, the SDK used a fairly complex pixel shader running on every pixel of the screen. However, after testing many methods, Oculus now recommends rendering a mesh of triangles to perform the corrections. The shaders used are simpler and therefore run faster, especially when you use higher resolutions. The shaders also have a more flexible distortion model that allows us to use higher-precision distortion correction.

OculusRoomTiny is a simple demonstration of how to apply this distortion. The vertex shader looks like the following:

float2 EyeToSourceUVScale, EyeToSourceUVOffset;
float4x4 EyeRotationStart, EyeRotationEnd;
float2 TimewarpTexCoord(float2 TexCoord, float4x4 rotMat)
{
    // Vertex inputs are in TanEyeAngle space for the R,G,B channels (i.e. after chromatic 
    // aberration and distortion). These are now "real world" vectors in direction (x,y,1) 
    // relative to the eye of the HMD. Apply the 3x3 timewarp rotation to these vectors.
    float3 transformed = float3( mul ( rotMat, float4(TexCoord.xy, 1, 1) ).xyz);

    // Project them back onto the Z=1 plane of the rendered images.
    float2 flattened = (transformed.xy / transformed.z);

    // Scale them into ([0,0.5],[0,1]) or ([0.5,0],[0,1]) UV lookup space (depending on eye)
    return(EyeToSourceUVScale * flattened + EyeToSourceUVOffset);
}

void main(in float2 Position    : POSITION,    in float timewarpLerpFactor : POSITION1,
          in float Vignette     : POSITION2,   in float2 TexCoord0         : TEXCOORD0,
          in float2 TexCoord1   : TEXCOORD1,   in float2 TexCoord2         : TEXCOORD2,
          out float4 oPosition  : SV_Position, out float2 oTexCoord0       : TEXCOORD0,   
          out float2 oTexCoord1 : TEXCOORD1,   out float2 oTexCoord2       : TEXCOORD2,
          out float  oVignette  : TEXCOORD3)
{
    float4x4 lerpedEyeRot = lerp(EyeRotationStart, EyeRotationEnd, timewarpLerpFactor);
    oTexCoord0  = TimewarpTexCoord(TexCoord0,lerpedEyeRot);
    oTexCoord1  = TimewarpTexCoord(TexCoord1,lerpedEyeRot);
    oTexCoord2  = TimewarpTexCoord(TexCoord2,lerpedEyeRot);
    oPosition = float4(Position.xy, 0.5, 1.0);
    oVignette = Vignette;   /* For vignette fade */
}

The position XY data is already in Normalized Device Coordinates (NDC) space (-1 to +1 across the entire framebuffer). Therefore, the vertex shader simply sets W to 1 and Z to a default value (Z is unused because depth buffering is not enabled during distortion correction). There are no other changes. EyeToSourceUVScale and EyeToSourceUVOffset map the texture coordinates to wherever the eye images are placed in the render texture; for example, with both eye views side by side in a shared texture, the left eye's flattened coordinates map into the [0, 0.5] range of U.

The pixel shader is as follows:

Texture2D Texture   : register(t0);
SamplerState Linear : register(s0);

float4 main(in float4 oPosition  : SV_Position, in float2 oTexCoord0 : TEXCOORD0,
            in float2 oTexCoord1 : TEXCOORD1, in float2 oTexCoord2 : TEXCOORD2,
            in float oVignette : TEXCOORD3)   : SV_Target
{
    // 3 samples for fixing chromatic aberrations
    float R = Texture.Sample(Linear, oTexCoord0.xy).r;
    float G = Texture.Sample(Linear, oTexCoord1.xy).g;
    float B = Texture.Sample(Linear, oTexCoord2.xy).b;
    return (oVignette*float4(R,G,B,1));
}
  

The pixel shader samples the red, green, and blue components from the source texture at the specified coordinates and modulates them by the vignette factor. The vignette is applied at the edges of the view to give a smooth fade-to-black rather than an abrupt cut-off. A sharp edge triggers the motion-sensing neurons at the edge of our vision and can be very distracting; a smooth fade-to-black reduces this effect substantially.

As you can see, the shaders are very simple, and all the math happens during the generation of the mesh positions and UV coordinates. To generate the distortion mesh, call ovrHmd_CreateDistortionMesh. This function generates the mesh data in the form of an indexed triangle list, which you can then convert to the data format required by your graphics engine. It is also necessary to call ovrHmd_GetRenderScaleAndOffset to retrieve values for the constants EyeToSourceUVScale and EyeToSourceUVOffset used in the vertex shader. For example, in OculusRoomTiny:
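
Note that DistortionVertex in the code below is defined by the application rather than the SDK. A plausible layout, reconstructed from the copy loop that follows (the SDK only prescribes ovrDistortionVertex):

struct DistortionVertex
{
    Vector2f Pos;   // NDC position from ovrDistortionVertex::Pos.
    Vector2f TexR;  // UV for the red channel sample.
    Vector2f TexG;  // UV for the green channel sample.
    Vector2f TexB;  // UV for the blue channel sample.
    Color    Col;   // R,G,B = vignette factor; A = timewarp lerp factor.
};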

 
//Generate distortion mesh for each eye
for ( int eyeNum = 0; eyeNum < 2; eyeNum++ )
{
    // Allocate & generate distortion mesh vertices.
    ovrDistortionMesh meshData;
    ovrHmd_CreateDistortionMesh(hmd, 
                                EyeRenderDesc[eyeNum].Eye, EyeRenderDesc[eyeNum].Fov, 
                                distortionCaps, &meshData);

    ovrHmd_GetRenderScaleAndOffset(EyeRenderDesc[eyeNum].Fov,
                                   textureSize, viewports[eyeNum],
                                   (ovrVector2f*) DistortionData.UVScaleOffset[eyeNum]);
                                     
    // Now parse the vertex data and create a render ready vertex buffer from it
    DistortionVertex * pVBVerts = (DistortionVertex*)OVR_ALLOC( 
                                       sizeof(DistortionVertex) * meshData.VertexCount );
    DistortionVertex * v        = pVBVerts;
    ovrDistortionVertex * ov    = meshData.pVertexData;
    for ( unsigned vertNum = 0; vertNum < meshData.VertexCount; vertNum++ )
    {
        v->Pos.x = ov->Pos.x;
        v->Pos.y = ov->Pos.y;
        v->TexR  = (*(Vector2f*)&ov->TexR);
        v->TexG  = (*(Vector2f*)&ov->TexG);
        v->TexB  = (*(Vector2f*)&ov->TexB);
        v->Col.R = v->Col.G = v->Col.B = (OVR::UByte)( ov->VignetteFactor * 255.99f );
        v->Col.A = (OVR::UByte)( ov->TimeWarpFactor * 255.99f );
        v++; ov++;
    }

    //Register this mesh with the renderer
    DistortionData.MeshVBs[eyeNum] = *pRender->CreateBuffer();
    DistortionData.MeshVBs[eyeNum]->Data ( Buffer_Vertex, pVBVerts,
                                           sizeof(DistortionVertex) * meshData.VertexCount );

    DistortionData.MeshIBs[eyeNum] = *pRender->CreateBuffer();
    DistortionData.MeshIBs[eyeNum]->Data ( Buffer_Index, meshData.pIndexData,
                                           sizeof(unsigned short) * meshData.IndexCount );

    OVR_FREE ( pVBVerts );
    ovrHmd_DestroyDistortionMesh( &meshData );
}
  

For extra performance, this code can be merged with existing post-processing shaders, such as exposure correction or color grading. However, to ensure that the merged shader and mesh still calculate the correct distortion, you should perform pixel-exact comparisons of the output before and after the merge. It is very common to get something that looks plausible, but even a few pixels of error can cause discomfort for users.
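
One way to verify this is to capture the distorted output before and after the merge and compare the two captures exactly; a minimal sketch, assuming RGBA8 readbacks obtained through your graphics API:

#include <cstring>

// Returns true if two same-sized RGBA8 framebuffer readbacks are
// byte-for-byte identical. Any difference means the merged shader
// changed the distortion output and needs to be fixed.
bool PixelsIdentical(const unsigned char* before, const unsigned char* after,
                     int width, int height)
{
    return std::memcmp(before, after, (size_t)width * height * 4) == 0;
}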

Game Rendering Loop

The game rendering loop brings these pieces together. Each frame, the application begins frame timing, renders both eye views, waits until the timewarp point, runs the distortion pass, and then presents the frame.

The following code demonstrates this:

    ovrHmd hmd;
    ovrPosef headPose[2];

    ovrFrameTiming frameTiming = ovrHmd_BeginFrameTiming(hmd, 0); 

    pRender->SetRenderTarget ( pRendertargetTexture );
    pRender->Clear();
    
    for (int eyeIndex = 0; eyeIndex < ovrEye_Count; eyeIndex++)
    {
        ovrEyeType eye = hmd->EyeRenderOrder[eyeIndex];
        headPose[eye] = ovrHmd_GetEyePose(hmd, eye);

        Quatf orientation = Quatf(headPose[eye].Orientation);
        Matrix4f proj = ovrMatrix4f_Projection(EyeRenderDesc[eye].Fov, 
                                                       0.01f, 10000.0f, true);
                        
        // * Test code *
        // Assign quaternion result directly to view (translation is ignored).
        Matrix4f view = Matrix4f(orientation.Inverted()) * Matrix4f::Translation(-WorldEyePosition);

        pRender->SetViewport(EyeRenderViewport[eye]);
        pRender->SetProjection(proj);       
         
        pRoomScene->Render(pRender, Matrix4f::Translation(EyeRenderDesc[eye].ViewAdjust) * view);
    }    

    // Wait till time-warp point to reduce latency.
    ovr_WaitTillTime(frameTiming.TimewarpPointSeconds);


    // Prepare for distortion rendering.
    pRender->SetRenderTarget(NULL);
    pRender->SetFullViewport();
    pRender->Clear();

    ShaderFill distortionShaderFill(DistortionData.Shaders);
    distortionShaderFill.SetTexture(0, pRendertargetTexture);
    distortionShaderFill.SetInputLayout(DistortionData.VertexIL);

    for (int eyeIndex = 0; eyeIndex < 2; eyeIndex++)
    {
        // Setup shader constants
        DistortionData.Shaders->SetUniform2f("EyeToSourceUVScale",
                                             DistortionData.UVScaleOffset[eyeIndex][0].x,
                                             DistortionData.UVScaleOffset[eyeIndex][0].y);
        DistortionData.Shaders->SetUniform2f("EyeToSourceUVOffset",
                                             DistortionData.UVScaleOffset[eyeIndex][1].x,
                                             DistortionData.UVScaleOffset[eyeIndex][1].y);

        ovrMatrix4f timeWarpMatrices[2];
        ovrHmd_GetEyeTimewarpMatrices(hmd, (ovrEyeType) eyeIndex, headPose[eyeIndex], 
                                      timeWarpMatrices);

        DistortionData.Shaders->SetUniform4x4f("EyeRotationStart", Matrix4f(timeWarpMatrices[0]));
        DistortionData.Shaders->SetUniform4x4f("EyeRotationEnd",   Matrix4f(timeWarpMatrices[1]));

        // Perform distortion
        pRender->Render(&distortionShaderFill,
                        DistortionData.MeshVBs[eyeIndex], DistortionData.MeshIBs[eyeIndex]);
    }
                         
    pRender->Present( VSyncEnabled );
    pRender->WaitUntilGpuIdle();  //for lowest latency
    ovrHmd_EndFrameTiming(hmd);