
Initialization and Sensor Enumeration

This example initializes LibOVR and requests information about the first available HMD.

Review the following code:

// Include the OculusVR SDK
#include "OVR_CAPI.h"

void Initialization()
{
    ovr_Initialize();

    ovrHmd hmd = ovrHmd_Create(0);

    if (hmd)
    {
        // Get more details about the HMD.
        ovrSizei resolution = hmd->Resolution;
        ...
    }

    // Do something with the HMD.
    ...

    ovrHmd_Destroy(hmd);
    ovr_Shutdown();
}

As you can see from the code, ovr_Initialize must be called before using any of the API functions, and ovr_Shutdown must be called to shut down the library before you exit the program. In between these function calls, you are free to create HMD objects, access sensors, and perform application rendering.

In this example, ovrHmd_Create(0) creates the first available HMD. ovrHmd_Create accesses HMDs by index, which is an integer ranging from 0 up to (but not including) the value returned by ovrHmd_Detect. Users can call ovrHmd_Detect any time after library initialization to re-enumerate the connected Oculus devices. Finally, ovrHmd_Destroy must be called to clear the HMD before shutting down the library.

If no Rift is plugged in during detection, ovrHmd_Create(0) will return a null handle. In this case, you can use ovrHmd_CreateDebug to create a virtual HMD of the specified type. Although the virtual HMD will not provide any sensor input, it can be useful for debugging Rift-compatible rendering code and doing general development without a physical device.

The ovrHmd handle is actually a pointer to an ovrHmdDesc struct that contains information about the HMD and its capabilities, and is used to set up rendering. The following table describes the fields:

Field | Type | Description
Type | ovrHmdType | Type of the HMD device.
ProductName | const char* | Name of the product as a string.
Manufacturer | const char* | Name of the manufacturer.
VendorId | short | Vendor ID reported by the headset USB device.
ProductId | short | Product ID reported by the headset USB device.
SerialNumber | char[] | Serial number string reported by the headset USB device.
FirmwareMajor | short | The major version of the sensor firmware.
FirmwareMinor | short | The minor version of the sensor firmware.
CameraFrustumHFovInRadians | float | The horizontal FOV of the position tracking camera frustum.
CameraFrustumVFovInRadians | float | The vertical FOV of the position tracking camera frustum.
CameraFrustumNearZInMeters | float | The distance from the position tracking camera to the near frustum bounds.
CameraFrustumFarZInMeters | float | The distance from the position tracking camera to the far frustum bounds.
HmdCaps | unsigned int | HMD capability bits described by ovrHmdCaps.
TrackingCaps | unsigned int | Tracking capability bits describing whether orientation, position tracking, and yaw drift correction are supported.
DistortionCaps | unsigned int | Distortion capability bits describing whether timewarp and chromatic aberration correction are supported.
DefaultEyeFov | ovrFovPort[] | Recommended optical field of view for each eye.
MaxEyeFov | ovrFovPort[] | Maximum optical field of view that can be practically rendered for each eye.
EyeRenderOrder | ovrEyeType[] | Preferred eye rendering order for best performance. Using this value can help reduce latency on sideways-scanned screens.
Resolution | ovrSizei | Resolution of the full HMD screen (both eyes) in pixels.
WindowsPos | ovrVector2i | Location of the monitor window on the screen. Set to (0,0) if not supported.
DisplayDeviceName | const char* | System-specific name of the display device.
DisplayId | int | System-specific ID of the display device.

Head Tracking and Sensors

The Oculus Rift hardware contains a number of micro-electro-mechanical systems (MEMS) sensors including a gyroscope, accelerometer, and magnetometer.

Starting with DK2, there is also an external camera to track headset position. The information from each of these sensors is combined through a process known as sensor fusion to determine the motion of the user’s head in the real world, and to synchronize the user’s virtual view in real-time.

To use the Oculus sensor, you first need to initialize tracking and sensor fusion by calling ovrHmd_ConfigureTracking. This function has the following signature:

  
ovrBool  ovrHmd_ConfigureTracking(ovrHmd hmd, unsigned int supportedTrackingCaps,
                                              unsigned int requiredTrackingCaps);

ovrHmd_ConfigureTracking takes two sets of capability flags as input. These both use flags declared in ovrTrackingCaps. supportedTrackingCaps describes the HMD tracking capabilities that the application supports, and which should be used when available. requiredTrackingCaps specifies capabilities that must be supported by the HMD at the time of the call for the application to operate correctly. If the required capabilities are not present, ovrHmd_ConfigureTracking returns false.
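
This supported-versus-required contract can be illustrated with a small self-contained sketch. The function and flag names below are hypothetical placeholders, not the SDK's actual constants; real code uses the ovrTrackingCaps values from OVR_CAPI.h.

```cpp
// Hypothetical placeholder flag values for illustration only.
enum TrackingCaps : unsigned int {
    Cap_Orientation      = 0x01,
    Cap_MagYawCorrection = 0x02,
    Cap_Position         = 0x04,
};

// Mirrors the contract of ovrHmd_ConfigureTracking: enable the intersection
// of what the application supports and what the device offers, but fail
// outright if any *required* capability is missing from the device.
bool ConfigureTrackingSketch(unsigned int deviceCaps,
                             unsigned int supportedCaps,
                             unsigned int requiredCaps,
                             unsigned int* enabledCaps)
{
    if ((deviceCaps & requiredCaps) != requiredCaps)
        return false;                              // a required capability is absent
    *enabledCaps = deviceCaps & supportedCaps;     // use supported caps when available
    return true;
}
```

Under this model, a device offering only orientation tracking still succeeds when position is supported but not required, which is exactly why the DK1/DK2 example below works on both headsets.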

After tracking is initialized, you can poll sensor fusion for head position and orientation by calling ovrHmd_GetTrackingState. These calls are demonstrated by the following code:

// Start the sensor which provides the Rift's pose and motion.
ovrHmd_ConfigureTracking(hmd, ovrTrackingCap_Orientation |
                              ovrTrackingCap_MagYawCorrection |
                              ovrTrackingCap_Position, 0);

// Query the HMD for the current tracking state.
ovrTrackingState ts = ovrHmd_GetTrackingState(hmd, ovr_GetTimeInSeconds());

if (ts.StatusFlags & (ovrStatus_OrientationTracked | ovrStatus_PositionTracked))
{
    Posef pose = ts.HeadPose.ThePose;
    ...
}

This example initializes the sensors with orientation, yaw correction, and position tracking capabilities if available, while only requiring basic orientation tracking. This means that the code will work for DK1, and will automatically use camera-based position tracking on DK2. If you are using a DK2 headset and the DK2 camera is not available at the time of the call, but is plugged in later, the camera is automatically enabled by the SDK.

After the sensors are initialized, the sensor state is obtained by calling ovrHmd_GetTrackingState. This state includes the predicted head pose and the current tracking state of the HMD as described by StatusFlags. This state can change at runtime based on the available devices and user behavior. For example with DK2, the ovrStatus_PositionTracked flag is only reported when HeadPose includes the absolute positional tracking data from the camera.

The reported ovrPoseStatef includes full six degrees of freedom (6DoF) head tracking data including orientation, position, and their first and second derivatives. The pose value is reported for a specified absolute point in time using prediction, typically corresponding to the time in the future that this frame's image will be displayed on screen. To facilitate prediction, ovrHmd_GetTrackingState takes absolute time, in seconds, as its second argument. The current value of absolute time can be obtained by calling ovr_GetTimeInSeconds. If the time passed into ovrHmd_GetTrackingState is the current time or earlier, then the tracking state returned will be based on the latest sensor readings with no prediction. In a production application, however, you should use one of the real-time computed values returned by ovrHmd_BeginFrame or ovrHmd_BeginFrameTiming. Prediction is covered in more detail in the section on Frame Timing.

As already discussed, the reported pose includes a 3D position vector and an orientation quaternion. The orientation is reported as a rotation in a right-handed coordinate system, as illustrated in the following figure.

Figure 9. Rift Coordinate System

The x-z plane is aligned with the ground regardless of camera orientation.

As seen from the diagram, the coordinate system uses the following axis definitions:

  • Y is positive in the up direction.
  • X is positive to the right.
  • Z is positive heading backwards.

Rotation is maintained as a unit quaternion, but can also be reported in yaw-pitch-roll form. Positive rotation is counter-clockwise (CCW, direction of the rotation arrows in the diagram) when looking in the negative direction of each axis, and the component rotations are:

  • Pitch is rotation around X, positive when pitching up.
  • Yaw is rotation around Y, positive when turning left.
  • Roll is rotation around Z, positive when tilting to the left in the XY plane.

The simplest way to extract yaw-pitch-roll from ovrPosef is to use the C++ OVR Math helper classes that are included with the library. The following example uses direct conversion to assign ovrPosef to the equivalent C++ Posef class. You can then use Quatf::GetEulerAngles<> to extract the Euler angles in the desired axis rotation order.
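
If the OVR math headers are not at hand, the same extraction can be sketched in plain C++. The Quat struct below is a hypothetical stand-in for OVR's Quatf, and the function mirrors what Quatf::GetEulerAngles<Axis_Y, Axis_X, Axis_Z> computes for the coordinate system described above (Y up, X right, Z backward, rotation order R = Ry(yaw) * Rx(pitch) * Rz(roll)):

```cpp
#include <cmath>

// Hypothetical stand-in for OVR's Quatf; real code uses OVR_Math.h.
struct Quat { float x, y, z, w; };

// Extract yaw (about Y), pitch (about X), and roll (about Z) from a unit
// quaternion, using the rotation order R = Ry(yaw) * Rx(pitch) * Rz(roll).
void GetYawPitchRoll(const Quat& q, float* yaw, float* pitch, float* roll)
{
    *yaw   = std::atan2(2.0f * (q.x * q.z + q.w * q.y),
                        1.0f - 2.0f * (q.x * q.x + q.y * q.y));
    *pitch = std::asin (2.0f * (q.w * q.x - q.y * q.z));
    *roll  = std::atan2(2.0f * (q.x * q.y + q.w * q.z),
                        1.0f - 2.0f * (q.x * q.x + q.z * q.z));
}
```

For example, a quaternion representing a pure 90-degree rotation about Y yields a yaw of roughly pi/2 with zero pitch and roll.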

All simple C math types provided by OVR such as ovrVector3f and ovrQuatf have corresponding C++ types that provide constructors and operators for convenience. These types can be used interchangeably.

Position Tracking

The position tracking camera's frustum is defined by the horizontal and vertical FOV and the distances to the front and back frustum planes.

Approximate values for these parameters can be accessed through the ovrHmdDesc struct as follows:

ovrHmd hmd = ovrHmd_Create(0);

if (hmd)
{
    // Extract tracking frustum parameters.
    float frustumHorizontalFOV = hmd->CameraFrustumHFovInRadians;
    ...
}

The following figure shows the DK2 position tracking camera mounted on a PC monitor and a representation of the resulting tracking frustum.

Figure 10. Position Tracking Camera and Tracking Frustum

The relevant parameters and typical values are listed below:

Field | Type | Typical Value
CameraFrustumHFovInRadians | float | 1.292 radians (74 degrees)
CameraFrustumVFovInRadians | float | 0.942 radians (54 degrees)
CameraFrustumNearZInMeters | float | 0.4 m
CameraFrustumFarZInMeters | float | 2.5 m

These parameters are provided to enable application developers to provide a visual representation of the tracking frustum. The previous figure also shows the default tracking origin and associated coordinate system.
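
For example, the physical cross-section of the frustum at a given distance follows directly from these FOV values using basic trigonometry (a self-contained sketch; the function name is illustrative):

```cpp
#include <cmath>

// Width (or height) of the tracking frustum cross-section at a given
// distance from the camera, given the horizontal (or vertical) FOV.
float FrustumExtentAt(float fovInRadians, float distanceInMeters)
{
    return 2.0f * distanceInMeters * std::tan(fovInRadians * 0.5f);
}
```

With the typical DK2 values above, the frustum is roughly 3.8 m wide and 2.5 m tall at the 2.5 m far plane.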

Note: Although the camera axis (and hence the tracking frustum) is shown tilted downwards slightly, the tracking coordinate system is always oriented horizontally such that the x and z axes are parallel to the ground.

By default the tracking origin is located one meter away from the camera in the direction of the optical axis but with the same height as the camera. The default origin orientation is level with the ground with the negative z axis pointing towards the camera. In other words, a headset yaw angle of zero corresponds to the user looking towards the camera.

Note: This can be modified using the API call ovrHmd_RecenterPose which resets the tracking origin to the headset’s current location, and sets the yaw origin to the current headset yaw value.
Note: The tracking origin is set on a per application basis; switching focus between different VR apps also switches the tracking origin.

The head pose is returned by calling ovrHmd_GetTrackingState. The returned ovrTrackingState struct contains several items relevant to position tracking:

  • HeadPose—includes both head position and orientation.
  • CameraPose—the pose of the camera relative to the tracking origin.
  • LeveledCameraPose—the pose of the camera relative to the tracking origin but with roll and pitch zeroed out. You can use this as a reference point to render real-world objects in the correct place.

The StatusFlags variable contains three status bits relating to position tracking:

  • ovrStatus_PositionConnected—this is set when the position tracking camera is connected and functioning properly.
  • ovrStatus_PositionTracked—this is set only when the headset is being actively tracked.
  • ovrStatus_CameraPoseTracked—this is set after the initial camera calibration has taken place. Typically this requires the headset to be reasonably stationary within the view frustum for a second or so at the start of tracking. It may be necessary to communicate this to the user if the ovrStatus_CameraPoseTracked flag doesn’t become set quickly after entering VR.
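
A small self-contained sketch shows how an application might interpret these bits, for example to decide what to tell the user. The bit values below are hypothetical placeholders; real code uses the ovrStatusBits constants from OVR_CAPI.h.

```cpp
#include <cstring>

// Hypothetical placeholder bit values for illustration only.
enum StatusBits : unsigned int {
    Status_OrientationTracked = 0x0001,
    Status_PositionTracked    = 0x0002,
    Status_CameraPoseTracked  = 0x0004,
    Status_PositionConnected  = 0x0008,
};

// Classify the position-tracking situation from a StatusFlags word, e.g. to
// prompt the user to hold still during calibration or move back into view.
const char* DescribePositionTracking(unsigned int statusFlags)
{
    if (!(statusFlags & Status_PositionConnected))
        return "no position camera connected";
    if (!(statusFlags & Status_CameraPoseTracked))
        return "calibrating: hold the headset still in view of the camera";
    if (!(statusFlags & Status_PositionTracked))
        return "position lost: headset outside frustum or occluded";
    return "position tracked";
}
```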

There are several conditions that may cause position tracking to be interrupted and for the ovrStatus_PositionTracked flag to become zero:

  • The headset moved wholly or partially outside the tracking frustum.

  • The headset adopts an orientation that is not easily trackable with the current hardware (for example facing directly away from the camera).

  • The exterior of the headset is partially or fully occluded from the tracking camera’s point of view (for example by hair or hands).

  • The velocity of the headset exceeds the expected range.

Following an interruption, assuming the conditions above are no longer present, tracking normally resumes quickly and the ovrStatus_PositionTracked flag is set.

User Input Integration

For most applications, head tracking should be integrated with an existing control scheme to provide the most comfortable, intuitive, and usable interface for the player.

For example, in a first person shooter (FPS) game, the player generally moves forward, backward, left, and right using the left joystick, and looks left, right, up, and down using the right joystick. When using the Rift, the player can now look left, right, up, and down, using their head. However, players should not be required to frequently turn their heads 180 degrees since this creates a bad user experience. Generally, they need a way to reorient themselves so that they are always comfortable (the same way in which we turn our bodies if we want to look behind ourselves for more than a brief glance).

To summarize, developers should carefully consider their control schemes and how to integrate head-tracking when designing applications for VR. The OculusRoomTiny application provides a source code sample that shows how to integrate Oculus head tracking with the aforementioned standard FPS control scheme.
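
One common approach is to keep a stick-controlled body yaw separate from the tracked head yaw and sum them for the rendered view, so the player can re-orient with the stick instead of physically turning 180 degrees. The following is a minimal self-contained sketch of that idea; all names and the turn rate are illustrative, not from the SDK:

```cpp
// Body yaw in radians, driven by the controller; head yaw comes from tracking.
struct PlayerOrientation {
    float bodyYaw = 0.0f;
};

// Advance body yaw from stick input and return the combined view yaw.
// Positive yaw turns left (matching the Rift convention above), so pushing
// the stick right (positive stickX) decreases yaw.
float ViewYaw(PlayerOrientation& p, float stickX, float turnRateRadPerSec,
              float dtSec, float headYaw)
{
    p.bodyYaw -= stickX * turnRateRadPerSec * dtSec;
    return p.bodyYaw + headYaw;
}
```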

For more information about good and bad practices, refer to the Oculus Best Practices Guide.

Health and Safety Warning

All applications that use the Oculus Rift must integrate code that displays a health and safety warning when the device is used.

This warning must appear for a short amount of time when the Rift first displays a VR scene; it can be dismissed by pressing a key or tapping on the headset. Currently, the warning displays for at least 15 seconds the first time a new profile user puts on the headset and 6 seconds afterwards.
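
The dismissal timing described above (at least 15 seconds for a new profile user, 6 seconds afterwards) can be expressed as a simple gate. This is an illustrative sketch only; the SDK tracks this state itself via ovrHSWDisplayState:

```cpp
// Returns whether the health and safety warning may be dismissed, given how
// long it has been displayed. Thresholds are the values stated in the text;
// function and parameter names are illustrative.
bool CanDismissWarning(bool firstTimeForProfile, float secondsDisplayed)
{
    const float minimumSeconds = firstTimeForProfile ? 15.0f : 6.0f;
    return secondsDisplayed >= minimumSeconds;
}
```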

The warning displays automatically as an overlay in SDK-rendered mode. In app-rendered mode, it is left for developers to implement. To support timing and rendering the safety warning, we've added two functions to the C API: ovrHmd_GetHSWDisplayState and ovrHmd_DismissHSWDisplay. ovrHmd_GetHSWDisplayState reports the state of the warning described by the ovrHSWDisplayState structure, including the Displayed flag and how much time is left before it can be dismissed. ovrHmd_DismissHSWDisplay should be called in response to a keystroke or gamepad action to dismiss the warning.

The following code snippet illustrates how the health and safety warning can be handled:

    // Health and Safety Warning display state.
    ovrHSWDisplayState hswDisplayState;
    ovrHmd_GetHSWDisplayState(HMD, &hswDisplayState);

    if (hswDisplayState.Displayed)
    {
        // Dismiss the warning if the user pressed the appropriate key or
        // is tapping the side of the HMD.
        if (Util_GetAndResetHSWDismissedState())
            ovrHmd_DismissHSWDisplay(HMD);
        else
        {
            // Detect a moderate tap on the side of the HMD.
            ovrTrackingState ts = ovrHmd_GetTrackingState(HMD, ovr_GetTimeInSeconds());

            if (ts.StatusFlags & ovrStatus_OrientationTracked)
            {
                const OVR::Vector3f v(ts.RawSensorData.Accelerometer.x,
                                      ts.RawSensorData.Accelerometer.y,
                                      ts.RawSensorData.Accelerometer.z);

                // Arbitrary value representing a moderate tap on the side of the DK2 Rift.
                if (v.LengthSq() > 250.f) 
                    ovrHmd_DismissHSWDisplay(HMD);
            }
        }
    }

With the release of 0.4.3, the Health and Safety Warning can be disabled through the Oculus Configuration Utility. Before suppressing the Health and Safety Warning, please note that by disabling the Health and Safety warning screen, you agree that you have read the warning, and that no other person will use the headset without reading this warning screen.

To use the Oculus Configuration Utility to suppress the Health and Safety Warning, a registry key setting must be added for Windows builds, while an environment variable must be added for non-Windows builds.

For Windows, the following key must be added if the Windows OS is 32-bit:

HKEY_LOCAL_MACHINE\Software\Oculus VR, LLC\LibOVR\HSWToggleEnabled

If the Windows OS is 64-bit, the path will be slightly different:

HKEY_LOCAL_MACHINE\Software\Wow6432Node\Oculus VR, LLC\LibOVR\HSWToggleEnabled

Setting the value of HSWToggleEnabled to 1 enables the Disable Health and Safety Warning check box in the Advanced Configuration panel of the Oculus Configuration Utility. For non-Windows builds, an environment variable named Oculus_LibOVR_HSWToggleEnabled must be created with the value "1".