Train Hand Tracking Sample for Unreal Engine

The Unreal HandsInteractionTrainMap sample scene demonstrates how you can implement hand tracking so that users can interact with objects in the physics system. In this sample, users use their hands to interact with near and distant objects and perform actions that affect the scene. For example, users can push buttons to turn on smoke or the train whistle, and interact with objects that are farther away, like the windmills. This topic introduces the actors, assets, and blueprints associated with this sample. You can also use the sample as a starting point for your own app.

The following image shows an example of the sample running.

For more about hand tracking with Unreal Engine, see Hand Tracking.

Get the Sample

The Unreal hand tracking sample is available in the Samples/Oculus directory of the Oculus GitHub repository. For example, for the Oculus v17 release, you can find the sample here. Note that to access the GitHub repository, you must have access to the private EpicGames/UnrealEngine repository. An Unreal license is not required.

For the sample to function properly, you must have v17 or later of the Oculus integration. For more information on how to get access to the Oculus GitHub repository, see Version Compatibility Reference.

Sample Walkthrough

The following is a description of the key blueprints and other actors that enable the core hand tracking functionality in this scene. Each is described in further detail later in this topic.

The BP_InteractableToolsManager blueprint creates the tool blueprints, which interact with Interactable objects like the buttons, windmills, and crossing guards. Properties of the BP_InteractableToolsManager actor are:

  • Left Hand Tools: an array of tools that should be added to the left hand
  • Right Hand Tools: an array of tools that should be added to the right hand

The version of BP_InteractableToolsManager in the scene creates two types of tools:

  • Poke tools: intended for near-field interactions such as buttons.
  • Ray tools: intended for far-field interactions such as windmills and crossing guards.

You can add or remove blueprints in either the Left Hand Tools or Right Hand Tools property. This sample contains the following tools; a code sketch of the manager pattern follows the list:

  • BP_FingerTipPokeToolIndex
  • BP_FingerTipPokeToolMiddle
  • BP_FingerTipPokeToolPinky
  • BP_FingerTipPokeToolRing
  • BP_FingerTipPokeToolThumb
  • BP_HandRayTool
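
The sample implements this manager as a Blueprint, but the underlying pattern is simple: keep one array of tool classes per hand and spawn an instance of each class at startup. The following C++ sketch illustrates that pattern; the class name AHandToolsManager and its members are hypothetical illustrations and are not part of the Oculus sample.

    // HandToolsManager.h -- hypothetical sketch of a per-hand tool manager.
    #pragma once

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Engine/World.h"
    #include "HandToolsManager.generated.h"

    UCLASS()
    class AHandToolsManager : public AActor
    {
        GENERATED_BODY()

    public:
        // Tool classes to spawn for each hand, editable in the Details panel
        // (analogous to the Left Hand Tools / Right Hand Tools arrays on
        // BP_InteractableToolsManager).
        UPROPERTY(EditAnywhere, Category = "Tools")
        TArray<TSubclassOf<AActor>> LeftHandTools;

        UPROPERTY(EditAnywhere, Category = "Tools")
        TArray<TSubclassOf<AActor>> RightHandTools;

    protected:
        virtual void BeginPlay() override
        {
            Super::BeginPlay();
            SpawnTools(LeftHandTools);
            SpawnTools(RightHandTools);
        }

    private:
        void SpawnTools(const TArray<TSubclassOf<AActor>>& ToolClasses)
        {
            for (const TSubclassOf<AActor>& ToolClass : ToolClasses)
            {
                if (ToolClass)
                {
                    // Each spawned tool is expected to attach itself to the
                    // appropriate hand once hand-tracking data is available.
                    GetWorld()->SpawnActor<AActor>(ToolClass);
                }
            }
        }
    };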

The BP_InteractableToolsManager uses an InteractableToolsInputRouter to update the collision states of all tools and to handle edge cases where the user should use the poke tool instead of the ray tool.
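
As a rough illustration of what such routing involves, the snippet below sketches the priority decision for one hand: when a fingertip is already near a near-field interactable, the poke tool wins and the ray tool is suppressed so a pinch cannot accidentally trigger a far-field object. The types and logic here are hypothetical and simplified; they are not the sample's InteractableToolsInputRouter implementation.

    // Hypothetical, simplified routing between a near-field (poke) tool and a
    // far-field (ray) tool for a single hand. Names are illustrative only.
    struct FHandToolState
    {
        bool bPokeToolNearInteractable = false; // fingertip overlaps a proximity zone
        bool bIsPinching = false;               // pinch gesture drives far-field selection
    };

    enum class EActiveTool { None, Poke, Ray };

    EActiveTool RouteInput(const FHandToolState& State)
    {
        // Near-field interaction takes priority: if the fingertip is inside an
        // interactable's proximity zone, suppress the ray tool so a pinch does
        // not also fire a far-field interactable behind the near one.
        if (State.bPokeToolNearInteractable)
        {
            return EActiveTool::Poke;
        }
        if (State.bIsPinching)
        {
            return EActiveTool::Ray;
        }
        return EActiveTool::None;
    }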

Sample Structure

This section provides an overview of the sample level.

  • BP_MainTrainTrack blueprint sets up the train track and initializes the train actor.
  • BP_HandsActiveChecker warns the user if they accidentally switch to controllers.
  • StaticProps contains all of the static objects, like the weeds, mountains, clouds, and bounds.
  • The windmills and crossing guards are far-field interactables, usable via far-field interaction (i.e., raycast + pinch). Each interactable has a collidable zone that allows the tools to interact with it. The collidable zones accept far-field tools via tool tags.
  • BP_Sky_Sphere, DirectionalLight, LightmassImportanceVolume, and SphereReflectionCapture are related to the lighting properties of the scene. The sample has one real-time (directional) light and also baked lighting information.
  • BP_ControllerBox is the blueprint that instantiates all buttons and anchors them to the controller box. Each button is an Interactable Button actor and has the following members:
    • ProximityZone: activated when a tool is in proximity to the button
    • ContactZone: activated when a tool starts to touch the button (assuming the interaction is valid)
    • ActionZone: activated when the button clicks or is activated and performs the appropriate action. For example, this can be used to start or stop the train.
    • InteractablePlaneCenter: the “center” of the button, used to filter out invalid presses. A press that starts from the wrong direction falls in the negative half-space of this plane and is rejected (see the sketch after this list).
    • ButtonHousing: the portion of the button that doesn’t move.
    • ButtonMeshComp: the part of the button that moves up and down.
    • ButtonMeshHelper: subscribes to AInteractableButton’s OnInteractableStateChanged, OnContactZoneEvent, and OnActionZoneEvent to change the look of the button during interaction. For example, a press with the poke tool will cause the button to “press” inwards and the ButtonMeshHelper animates this visual change.
    • ButtonGlow: The mesh that gets enabled when a tool is in the proximity zone of the button.
    • AudioComp: the audio component used to play click sounds.
  • GroundPlane is a large 1 km x 1 km grid surrounding the sample scene.
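
The InteractablePlaneCenter check described above reduces to a half-space test: take the vector from the plane's center to the tool tip and compare it with the plane normal. A non-negative dot product means the press approaches from the valid side; a negative one means the press is in the negative half-space and should be rejected. The helper below is a minimal sketch of that test using Unreal's FVector; the function name is hypothetical and this is not the sample's actual code.

    #include "Math/Vector.h" // FVector, from Unreal's core math library

    // Hypothetical half-space test for filtering button presses that approach
    // from the wrong side of the InteractablePlaneCenter plane.
    bool IsPressFromValidSide(const FVector& PlaneCenter,
                              const FVector& PlaneNormal,
                              const FVector& ToolTipLocation)
    {
        // A non-negative projection onto the plane normal means the tool tip is
        // in the positive half-space, i.e. the press comes from the valid side.
        const FVector CenterToTip = ToolTipLocation - PlaneCenter;
        return FVector::DotProduct(CenterToTip, PlaneNormal.GetSafeNormal()) >= 0.0f;
    }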

Use the Sample Code In Your App

To use the sample components in your own app, place a BP_InteractableToolsManager in your scene and then add any of the following interactables (you can also spawn them at runtime, as sketched after this list):

  • BP_InteractableButton
  • BP_Windmill
  • BP_XingPost
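
If you prefer to spawn these actors from code rather than placing them in the editor, the snippet below loads the Blueprint classes by asset path and spawns them into the world. This is a minimal sketch under the assumption that the Blueprints have been copied into your project; the /Game/... asset paths and the function name are hypothetical, so substitute the paths used in your own project.

    #include "Engine/World.h"
    #include "GameFramework/Actor.h"
    #include "UObject/UObjectGlobals.h"

    // Hypothetical helper that spawns the tools manager and one interactable at
    // runtime. The asset paths are placeholders for wherever the Blueprints
    // live in your project.
    void SpawnHandInteractionActors(UWorld* World)
    {
        UClass* ManagerClass = StaticLoadClass(AActor::StaticClass(), nullptr,
            TEXT("/Game/Blueprints/BP_InteractableToolsManager.BP_InteractableToolsManager_C"));
        UClass* ButtonClass = StaticLoadClass(AActor::StaticClass(), nullptr,
            TEXT("/Game/Blueprints/BP_InteractableButton.BP_InteractableButton_C"));

        if (World && ManagerClass && ButtonClass)
        {
            World->SpawnActor<AActor>(ManagerClass);
            // Place the button somewhere reachable in front of the user.
            World->SpawnActor<AActor>(ButtonClass,
                                      FVector(200.0f, 0.0f, 100.0f),
                                      FRotator::ZeroRotator);
        }
    }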

Sample Video

The following video shows the running sample and the use of hands in the scene to push buttons and interact with far-field objects.

If you want to try the sample, you can open it in Unreal Engine (with v17 or later of the Oculus integration) and launch to your USB-connected Quest device.

Learn More

For more about hand tracking with Unreal Engine, see Hand Tracking. For an additional sample that shows how to use hand tracking, see Hand Sample.