Oculus Go Development

On 6/23/20 Oculus announced plans to sunset Oculus Go. Information about dates and alternatives can be found in the Oculus Go introduction.

Oculus Quest Development

All Oculus Quest developers must pass concept review before gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, see Submitting Your App to the Oculus Quest Store.

Hand Tracking in Unreal Engine

The hand tracking feature enables the use of hands as an input method on Oculus Quest devices. It provides a new sense of presence, enhances social engagement, and can help deliver more natural interactions.

By integrating hand tracking, you let users perform simple gestures such as pinch, poke, and pinch-and-hold to select, click, scroll, drag and drop, return, or exit in your app.

The hand tracking feature allows you to develop UI elements that can be operated with hands and controllers interchangeably. When you opt to use hands, users can pinch or poke objects for near-field interactions. For far-field interactions, the hand's pose drives a laser cursor-pointer that behaves like the standard controller cursor. Use the cursor-pointer to highlight, select, click, or write your own app-level event logic.

Note that hand tracking complements Touch controllers, but is not intended to replace controllers in all scenarios, particularly with games or creative tools that require a high degree of precision.

Prerequisites

If you have not previously implemented input motion controllers in Unreal Engine, see UE’s Motion Controller Component Setup.

Before you start implementing hand tracking in your app, see the Hand Tracking Design Guidelines for terminology, best practices and interaction models when using hands as an input source in virtual reality.

For an overview about how hand tracking is implemented and how it has been used in apps, see the OC6 Video Presentation: Hand Tracking Deep Dive: Technology, Design, and Experiences.

Hand Tracking Architecture

The following image shows the architecture of the Oculus hand tracking implementation for UE4, and how input information from hands is routed through the same mechanism as controller input.

As shown in the diagram, Oculus Input remains the main source of input data for UE4. Oculus Input routes hand input through the UE4 input system the same way that controller button and stick input is routed. Pinches and pinch strength are also routed as hand input.

Hand-specific features like the mesh/skeleton and bone rotation are provided through the OculusHandTracking class, which is contained within the Oculus Input module. The OculusHandTracking class provides the blueprint library as well as access to hand-specific data like hand scale, pointer pose, bone rotation, and more.

Turn On Hand Tracking in Your Project

You can turn on hand tracking in Unreal Engine in the Project Settings, which adds the com.oculus.permission.HAND_TRACKING entry to the Android manifest for your project.

Go to Edit > Project Settings, then under Plugins, select OculusVR. Under Hand Tracking Support, choose one of the following:

  • Controllers: Hand tracking will not be enabled for your app.
  • Controllers and Hands: Users can use either hand tracking or controllers in your app.
  • Hands Only: Users must have hand tracking enabled on their device to use your app.
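When one of the hands options is selected, packaging adds the hand tracking entries to your app's Android manifest. The fragment below shows the shape of those entries; the exact attributes depend on your engine version, and the feature is marked required only when Hands Only is selected.

```xml
<!-- Added to AndroidManifest.xml when hand tracking support is enabled -->
<uses-permission android:name="com.oculus.permission.HAND_TRACKING" />
<!-- android:required becomes "true" when Hands Only is selected -->
<uses-feature android:name="oculus.software.handtracking" android:required="false" />
```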

Integration Details

The Oculus hand tracking integration for Unreal Engine features the following.

Updates to the Oculus Input Module

In summary, the Oculus Input Module has these additions for hand tracking:

  • The Oculus Input Module supports input from touch controllers and hand tracking.
  • The Oculus Input Module relays hand pinches and pinch strength through UE4’s input event system.
  • The module updates and stores new hand poses, which can be accessed through blueprints or through the OculusHandComponent.

Specifically:

  • FOculusHandState struct has been added. Similar to the controller-state structs, this struct provides the current hand state inputs and tracked state.

  • Pinch inputs are updated with key events and axes for pinches and pinch strength.
    • New key names and axes defined for hands are registered in the UE4 input system. These identify the fingers of each hand as Thumb, Index, Middle, Ring, and Pinky.

    • Pinches and pinch strength can be bound in the UE input settings so that their events can be associated with blueprints or with the OculusHandComponent. See Input Bindings in the next section for how to do this.

  • Hand poses are updated each frame; see Oculus Hand Component below for how poses are applied to the hand mesh.

Input Bindings for Hand Tracking

You can bind to hand tracking inputs like pinches and pinch strength using Unreal Engine’s input system. To create a new input binding with hand tracking:

  • Go to Edit > Project Settings, then find Engine > Input.
  • Under Action Mappings or Axis Mappings, add a new mapping.
  • For the mapping key value, search for the Oculus Hand category to bring up the various hand tracking input bindings. The following image shows an example:
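These mappings are stored in your project's Config/DefaultInput.ini. The fragment below is a sketch of what the resulting entries can look like; the exact key names come from the Oculus Hand category in the editor's key picker and may differ by plugin version, so treat the key and mapping names here as illustrative.

```ini
[/Script/Engine.InputSettings]
; Action mapping: fire an event when the left index finger pinches
+ActionMappings=(ActionName="SelectObject",Key=OculusHand_Left_IndexPinch)
; Axis mapping: read the analog pinch strength each frame
+AxisMappings=(AxisName="SelectStrength",Key=OculusHand_Left_IndexPinchStrength,Scale=1.0)
```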

(Image: hand tracking input binding options)

Hand Tracking Blueprints

The Oculus integration for Unreal Engine offers several resources, including the blueprint functions listed below.

  • GetBoneName: Returns the name of the bone from the bone ID.
  • GetBoneRotation: Returns all bone rotations corresponding to the hand type passed in.
  • GetDominantHand: Returns which hand is the user's dominant hand.
  • GetHandPointerPose: Returns the current hand pointer pose.
  • GetHandScale: Returns the hand scale.
  • GetSkeletalMeshFromType: Returns a USkeletalMesh for the specified hand. Assign the returned mesh to a USkeletalMeshComponent.
  • GetTrackingConfidence: Returns the tracking confidence of the hands.
  • InitializeHandPhysics: Initializes physics capsules on the runtime hand mesh.
  • IsHandTrackingEnabled: Returns true if hand tracking is enabled on the device.
  • IsPointerPoseValid: Returns true if the pointer pose is valid.

Oculus Hand Component

The Oculus Hand Component is provided by the OculusInput module for rendering hands. This component is a subclass of Unreal's UPoseableMeshComponent and must be a child of a UMotionController component, which provides the tracking pose and late-update functionality for hands.

The component handles loading the mesh and skeleton as well as updating the bones. It also handles setting new materials for the hand and hiding the hands when tracking is lost or confidence is low. It exposes options to update the root pose, update the root scale, set the pointer pose root, and enable physics capsules.

The following image shows an example of these properties and how to set them in UE4.

(Image: hand properties)

Handling System Gestures

If the user performs the system gesture to return to Oculus Home or access the menu, the gesture is surfaced through the OVRPlugin as an ovrpButton_Start signal and a status flag, similar to the user pressing the Oculus Home or menu button on a controller. You do not need special menu logic for hands in this case.

The following image shows the pinch gesture as well as the system gesture.

Dominant Hand

Dominant hand features are surfaced through Hand Status flags of the OVRPlugin. You can access this information by using the blueprint function GetDominantHand.

Sample Implementation

See the following samples: