ovrLayerHeader ovrLayerEyeFov::Header
Header.Type must be ovrLayerType_EyeFov.
ovrTextureSwapChain ovrLayerEyeFov::ColorTexture[ovrEye_Count]
ovrTextureSwapChains for the left and right eye, respectively.
The second of these can be NULL in the cases described above.
ovrRecti ovrLayerEyeFov::Viewport[ovrEye_Count]
Specifies the ColorTexture sub-rect UV coordinates.
Both Viewport[0] and Viewport[1] must be valid.
ovrFovPort ovrLayerEyeFov::Fov[ovrEye_Count]
The viewport field of view.
ovrPosef ovrLayerEyeFov::RenderPose[ovrEye_Count]
Specifies the position and orientation of each eye view, with position specified in meters.
RenderPose will typically be the value returned from ovr_CalcEyePoses, but can be different in special cases if a different head pose is used for rendering.
double ovrLayerEyeFov::SensorSampleTime
Specifies the timestamp when the source ovrPosef (used in calculating RenderPose) was sampled from the SDK.
Typically retrieved by calling ovr_GetTimeInSeconds around the instant the application calls ovr_GetTrackingState. The main purpose of this field is to accurately track application tracking latency.