The HandsInteractionTrainMap sample scene demonstrates how you can use hand tracking to let users interact with objects in the physics system. In this sample, users can use their hands to interact with near or distant objects and perform actions that affect the scene. For example, users can push buttons to turn on smoke or sound a train whistle, and interact with objects that are farther away, such as the windmills. This topic introduces the actors, assets, and blueprints associated with this sample. You can also use this sample as a starting point for your own app.
The following image shows an example of the sample running.
For more about hand tracking with Unreal Engine, see Hand Tracking.
The Unreal hand tracking sample is available in the Sample/Oculus directory of the Oculus GitHub repository. For example, for the Oculus v17 release, you can find the sample here. Note that to access the GitHub repository, you must be subscribed to the private EpicGames/UnrealEngine repository. An Unreal license is not required.
For the sample to function properly, you must have version 17 or later of the Oculus integration. For more information on how to get access to the Oculus Git repository, see Version Compatibility Reference.
Following is a description of key blueprints and other game objects that enable the core hand tracking functionality in this scene. These are all described in further detail later in this topic.
The BP_InteractableToolsManager blueprint creates the tool blueprints, which interact with Interactable objects like the buttons, windmills, and crossing guards. The BP_InteractableToolsManager actor has the following properties:
Left Hand Tools: an array of tools that should be added to the left hand
Right Hand Tools: an array of tools that should be added to the right hand
The version of BP_InteractableToolsManager in the scene creates two types of tools:
You can add or take away blueprints from either the Left or Right Hand Tools properties. This sample contains the following tools:
BP_InteractableToolsManager uses an InteractableToolsInputRouter to update the collision states of all tools and to cover edge cases where the user should use the poke tool instead of the ray tool.
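The routing decision can be illustrated with a minimal sketch. The names, the distance threshold, and the `RouteTools` function below are assumptions for illustration only; the actual sample derives tool states from its collision zones rather than a single distance value.

```cpp
#include <cassert>

// Hypothetical simplification of input-router behavior: when the hand is
// close enough to an interactable, the poke tool takes priority and the
// ray tool is disabled so the two tools do not fight over the same object.

enum class ToolState { Enabled, Disabled };

struct ToolStates {
    ToolState Poke;
    ToolState Ray;
};

// Distance (in cm, matching Unreal units) below which near-field
// interaction wins. This threshold is an assumption for this sketch.
constexpr float kNearFieldThreshold = 20.0f;

ToolStates RouteTools(float DistanceToNearestInteractable) {
    ToolStates States;
    if (DistanceToNearestInteractable < kNearFieldThreshold) {
        States.Poke = ToolState::Enabled;   // near-field: poke tool active
        States.Ray  = ToolState::Disabled;  // suppress the far-field ray
    } else {
        States.Poke = ToolState::Disabled;
        States.Ray  = ToolState::Enabled;   // far-field: ray tool active
    }
    return States;
}
```

In the sample, this kind of mutual exclusion is what prevents the ray tool from firing at a distant windmill while the user is poking a nearby button.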
This section provides an overview of the sample level.
The BP_MainTrainTrack blueprint sets up the train track and initializes the train actor.
BP_HandsActiveChecker warns the user if they accidentally switch to controllers.
StaticProps contains all of the static objects, like the weeds, mountains, clouds, and bounds.
SphereReflectionCapture actors relate to the lighting properties of the scene. The sample has one real-time (directional) light and also uses baked lighting information.
BP_ControllerBox is the blueprint that instantiates all buttons and anchors them to the controller box. Each button is an Interactable Button actor and has the following members:
ProximityZone: activated when a tool is in proximity to the button
ContactZone: activated when a tool starts to touch the button (assuming the interaction is valid)
ActionZone: activated when the button clicks or is activated and performs the appropriate action. For example, this can be used to start or stop the train.
InteractablePlaneCenter: the “center” of the button, used to filter invalid presses. If a press starts from the wrong direction, it would be in the negative half-space of this plane.
ButtonHousing: the portion of the button that doesn’t move.
ButtonMeshComp: the part of the button that moves up and down.
ButtonMeshHelper: subscribes to OnActionZoneEvent to change the look of the button during interaction. For example, a press with the poke tool causes the button to “press” inwards, and ButtonMeshHelper animates this visual change.
ButtonGlow: the mesh that is enabled when a tool is in the proximity zone of the button.
AudioComp: the audio component used to play click sounds.
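The press-filtering role of InteractablePlaneCenter can be shown with a short sketch. The vector type and the `IsPressDirectionValid` function below are hypothetical names for illustration, not the sample's API; the underlying idea is a standard half-space test against the button's plane.

```cpp
#include <cassert>

// Hypothetical illustration of how a plane through the button's "center"
// filters invalid presses: a press is valid only when the tool approaches
// from the positive half-space of that plane (in front of the button face).

struct Vec3 {
    float X, Y, Z;
};

float Dot(const Vec3& A, const Vec3& B) {
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

Vec3 Sub(const Vec3& A, const Vec3& B) {
    return {A.X - B.X, A.Y - B.Y, A.Z - B.Z};
}

// Returns true when ToolPosition lies in the positive half-space of the
// plane defined by PlaneCenter and PlaneNormal (the button's press axis).
// A press starting behind the plane (negative half-space) is rejected.
bool IsPressDirectionValid(const Vec3& ToolPosition,
                           const Vec3& PlaneCenter,
                           const Vec3& PlaneNormal) {
    return Dot(Sub(ToolPosition, PlaneCenter), PlaneNormal) >= 0.0f;
}
```

For example, a finger approaching a button from behind its housing would fail this test, so the contact never registers as a press.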
GroundPlane is a large 1 km x 1 km grid surrounding the sample scene.
To use the sample components in your own app, place a BP_InteractableToolsManager in your scene and then add any of the interactables:
The following video shows the running sample and the use of hands in the scene to push buttons and interact with far-field objects.
If you want to try the sample, you can open it in Unreal Engine (with v17 or later of the Oculus integration) and launch to your USB-connected Quest device.