LibOVR 1.43 Reference Guide

ovrLayerEyeFov Struct Reference

Describes a layer that specifies a monoscopic or stereoscopic view.

Data Fields

ovrLayerHeader Header
    Header.Type must be ovrLayerType_EyeFov.
ovrTextureSwapChain ColorTexture[ovrEye_Count]
    ovrTextureSwapChains for the left and right eye, respectively.
ovrRecti Viewport[ovrEye_Count]
    Specifies the ColorTexture sub-rect UV coordinates.
ovrFovPort Fov[ovrEye_Count]
    The viewport field of view.
ovrPosef RenderPose[ovrEye_Count]
    Specifies the position and orientation of each eye view, with position specified in meters.
double SensorSampleTime
    Specifies the timestamp when the source ovrPosef (used in calculating RenderPose) was sampled from the SDK.

Detailed Description

This is the layer type typically passed as layer 0 to ovr_SubmitFrame, as it is the layer type used to render a 3D stereoscopic view.
Three options exist with respect to mono/stereo texture usage:
  • ColorTexture[0] and ColorTexture[1] contain the left and right stereo renderings, respectively. Viewport[0] and Viewport[1] refer to ColorTexture[0] and ColorTexture[1], respectively.
  • ColorTexture[0] contains both the left and right renderings, ColorTexture[1] is NULL, and Viewport[0] and Viewport[1] refer to sub-rects within ColorTexture[0].
  • ColorTexture[0] contains a single monoscopic rendering, and Viewport[0] and Viewport[1] both refer to that rendering.
See Also:
ovrTextureSwapChain, ovr_SubmitFrame
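
The following is a minimal sketch of the first option above (separate left and right swap chains): it fills an ovrLayerEyeFov and submits it as layer 0. The function name SubmitEyeFovLayer and the parameters eyeChain and eyeTextureSize are hypothetical placeholders for state the application creates elsewhere; the pose helper ovr_CalcEyePoses2 is assumed to be available from OVR_CAPI_Util.h, and error handling is omitted.

    #include <string.h>
    #include "OVR_CAPI.h"
    #include "OVR_CAPI_Util.h"  /* assumed home of ovr_CalcEyePoses2 */

    ovrResult SubmitEyeFovLayer(ovrSession session, long long frameIndex,
                                ovrTextureSwapChain eyeChain[ovrEye_Count],
                                const ovrSizei eyeTextureSize[ovrEye_Count])
    {
        ovrHmdDesc hmdDesc = ovr_GetHmdDesc(session);
        int eye;

        /* Per-eye render descriptions for the default FOVs. */
        ovrPosef hmdToEyePose[ovrEye_Count];
        for (eye = 0; eye < ovrEye_Count; ++eye) {
            ovrEyeRenderDesc desc =
                ovr_GetRenderDesc(session, (ovrEyeType)eye, hmdDesc.DefaultEyeFov[eye]);
            hmdToEyePose[eye] = desc.HmdToEyePose;
        }

        /* Sample tracking for this frame's predicted display time and record
           when the sample was taken (reported below as SensorSampleTime). */
        double predictedTime = ovr_GetPredictedDisplayTime(session, frameIndex);
        double sensorSampleTime = ovr_GetTimeInSeconds();
        ovrTrackingState ts = ovr_GetTrackingState(session, predictedTime, ovrTrue);

        ovrPosef eyePoses[ovrEye_Count];
        ovr_CalcEyePoses2(ts.HeadPose.ThePose, hmdToEyePose, eyePoses);

        /* ... render the scene into eyeChain[0] and eyeChain[1] here ... */

        ovrLayerEyeFov layer;
        memset(&layer, 0, sizeof(layer));
        layer.Header.Type = ovrLayerType_EyeFov;
        layer.Header.Flags = 0;
        for (eye = 0; eye < ovrEye_Count; ++eye) {
            layer.ColorTexture[eye] = eyeChain[eye];
            layer.Viewport[eye].Pos.x = 0;
            layer.Viewport[eye].Pos.y = 0;
            layer.Viewport[eye].Size = eyeTextureSize[eye];
            layer.Fov[eye] = hmdDesc.DefaultEyeFov[eye];
            layer.RenderPose[eye] = eyePoses[eye];
        }
        layer.SensorSampleTime = sensorSampleTime;

        /* Submit the single layer as layer 0. */
        const ovrLayerHeader* layerList = &layer.Header;
        return ovr_SubmitFrame(session, frameIndex, NULL, &layerList, 1);
    }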

Field Documentation

ovrLayerHeader ovrLayerEyeFov::Header
Header.Type must be ovrLayerType_EyeFov.
ovrTextureSwapChain ovrLayerEyeFov::ColorTexture[ovrEye_Count]
ovrTextureSwapChains for the left and right eye, respectively.
The second entry may be NULL in the cases described above.
ovrRecti ovrLayerEyeFov::Viewport[ovrEye_Count]
Specifies the ColorTexture sub-rect UV coordinates.
Both Viewport[0] and Viewport[1] must be valid.
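For the shared-texture case (the second option in the Detailed Description), a sketch of the two viewports is shown below. The names sharedChain and sharedSize are hypothetical placeholders for a single side-by-side swap chain and its pixel size, and layer is an ovrLayerEyeFov being filled as in the submit sketch above.

    /* Left/right halves of one side-by-side texture of sharedSize.w x sharedSize.h. */
    ovrRecti leftVp  = { { 0,                0 }, { sharedSize.w / 2, sharedSize.h } };
    ovrRecti rightVp = { { sharedSize.w / 2, 0 }, { sharedSize.w / 2, sharedSize.h } };

    layer.ColorTexture[ovrEye_Left]  = sharedChain; /* both eyes sample this chain */
    layer.ColorTexture[ovrEye_Right] = NULL;        /* may be NULL in this case */
    layer.Viewport[ovrEye_Left]  = leftVp;
    layer.Viewport[ovrEye_Right] = rightVp;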
ovrFovPort ovrLayerEyeFov::Fov[ovrEye_Count]
The viewport field of view.
ovrPosef ovrLayerEyeFov::RenderPose[ovrEye_Count]
Specifies the position and orientation of each eye view, with position specified in meters.
RenderPose will typically be the value returned from ovr_CalcEyePoses, but can be different in special cases if a different head pose is used for rendering.
double ovrLayerEyeFov::SensorSampleTime
Specifies the timestamp when the source ovrPosef (used in calculating RenderPose) was sampled from the SDK.
Typically retrieved by calling ovr_GetTimeInSeconds around the instant the application calls ovr_GetTrackingState. The main purpose of this field is to accurately track application tracking latency.
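A sketch of this sampling pattern, assuming session, frameIndex, and layer as in the earlier example:

    double predictedTime    = ovr_GetPredictedDisplayTime(session, frameIndex);
    double sensorSampleTime = ovr_GetTimeInSeconds(); /* instant the pose is sampled */
    ovrTrackingState ts     = ovr_GetTrackingState(session, predictedTime, ovrTrue);
    /* ... derive RenderPose from ts.HeadPose.ThePose and render ... */
    layer.SensorSampleTime  = sensorSampleTime;       /* reported for latency tracking */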
The documentation for this struct was generated from the following file: Include/OVR_CAPI.h