Unreal Input

This section describes input handling for Oculus devices.

Gear VR Controller

The Gear VR Controller is an orientation-tracked input device available through Unreal as a Motion Controller.

For a discussion of best practices, see Gear VR Controller Best Practices in Oculus Best Practices.

For instructions on how to add a Motion Controller component to your Pawn or Character, see Motion Controller Component Setup in Epic’s Unreal documentation. Unreal has also provided a detailed training tutorial called Setting Up VR Motion Controllers.

Gear VR positions the controller relative to the user by using a body model to estimate the controller’s position. Whether the controller is visualized on the left or right side of the body is determined by left-handedness versus right-handedness, which is specified by users during controller pairing.

Orientation tracking is handled automatically by the Motion Controller Component. If you need to query the controller orientation, you can query the Motion Controller rotation.

Add the GearVRControllerComponent to create a MotionController with a Gear VR Controller mesh as a child (4.15 and later). The Gear VR Controller mesh may be found in Plugins/GearVR/Content/.

Motion Controller Components must be specified as either left or right controllers when they are added, and each Gear VR Controller button mapping has a left/right equivalent. However, any button click sends both left and right events, so the setting you choose when you add the Motion Controller component has no effect.

Input Sample

You will find an example of Gear VR Controller input in our Input sample available in the directory <install>/Samples/Oculus. Please see the sample and its Level Blueprint for a full illustration of how to use the controller in your game, including the button mappings.

Gear VR Controller Swiping Gestures

For Gear VR Controllers, the user interface of your VR experience should follow these natural scrolling and swiping gestures:

  • Swipe up: Pull content upward. Equivalent to scrolling down.
  • Swipe down: Pull content downward. Equivalent to scrolling up.
  • Swipe left: Pull content left or go to the next item or page.
  • Swipe right: Pull content right or go to the previous item or page.
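The mapping above can be summarized in code. The sketch below is a hypothetical plain-C++ helper (not part of the Unreal or Oculus API) that pairs each swipe direction with the natural-scrolling action it should produce: a swipe pulls content in the swipe direction, so the scroll moves the opposite way.

```cpp
#include <string>

// Illustrative only: Swipe and ScrollActionFor are hypothetical names,
// not engine types. Each swipe pulls content toward the swipe direction.
enum class Swipe { Up, Down, Left, Right };

std::string ScrollActionFor(Swipe s)
{
    switch (s)
    {
    case Swipe::Up:    return "scroll down";        // content pulled upward
    case Swipe::Down:  return "scroll up";          // content pulled downward
    case Swipe::Left:  return "next item/page";     // content pulled left
    case Swipe::Right: return "previous item/page"; // content pulled right
    }
    return "";
}
```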

Haptics for Rift Controllers

You may use the standard Play Haptic Effect Blueprint to send a specified haptic curve to the Oculus Touch or Xbox controller. For more information, see Unreal’s Play Haptic Effect guide.

Haptics in Unreal Engine 4.13

PlayHapticEffects may be configured to play haptic waves based on three types of input. Right-click in the Content Browser to bring up the context menu, select Miscellaneous, and then select one of the following three options:

  • Haptic Feedback Buffer: Plays a buffer of bytes with values 0-255.
  • Haptic Feedback Curve: Plays a haptic curve that you draw using the Haptic Curve Editor.
  • Haptic Feedback Soundwave: Plays a mono audio file converted into a haptic effect of corresponding amplitude.

The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effect. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Stop Haptic Effect sends a stop command to the Touch controller.

When the left controller X button is pressed, a constant vibration is sent by Set Haptics by Value until the button is released. Note that Set Haptics by Value calls are limited to 30 Hz; additional calls will be disregarded.
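The 30 Hz limit can also be respected on the application side by throttling calls before they reach the engine. The class below is a hypothetical plain-C++ sketch (HapticsThrottle and TryUpdate are illustrative names, not Unreal API) that drops updates arriving faster than the engine would accept.

```cpp
// Sketch of client-side throttling for Set Haptics by Value, which the
// engine limits to 30 Hz; faster calls are disregarded.
class HapticsThrottle
{
public:
    explicit HapticsThrottle(float MinIntervalSeconds = 1.0f / 30.0f)
        : MinInterval(MinIntervalSeconds), LastSendTime(-1.0f) {}

    // Returns true if a haptics value may be sent at time Now (seconds).
    bool TryUpdate(float Now)
    {
        if (LastSendTime >= 0.0f && (Now - LastSendTime) < MinInterval)
            return false; // would exceed 30 Hz; the engine would drop it
        LastSendTime = Now;
        return true;
    }

private:
    float MinInterval;
    float LastSendTime;
};
```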

Haptics in Unreal Engine 4.12

In addition to Play Haptic Effects, Unreal 4.12 adds Play Haptic Soundwave.

The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effects and Play Haptic Soundwave. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Play Haptic Soundwave sends a second vibration to the controller.

When the left controller X button is pressed, a constant vibration is sent by Set Haptics by Value until the button is released. Note that Set Haptics by Value calls are limited to 30 Hz; additional calls will be disregarded.

APlayerController::PlayHapticSoundWave takes a mono soundwave as an argument. It downsamples the wave into a series of bytes that serially describe the amplitude of the wave (uint8 values 0-255). Each byte is then multiplied by the factor specified in Scale (max = 255), and haptic vibrations are sent to the targeted Oculus Touch controller. Each controller must be targeted individually. Call Stop Haptic Effect to stop haptic playback.
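The conversion described above can be sketched in standalone form. The function below is a hypothetical plain-C++ illustration of the downsample-and-scale step (the name, signature, and Stride parameter are assumptions, not the engine's implementation): samples in [-1, 1] become amplitude bytes 0-255, multiplied by a scale factor.

```cpp
#include <cmath>
#include <cstdint>
#include <vector>

// Illustrative sketch of PlayHapticSoundWave's conversion, as described
// above: downsample a mono wave into bytes describing amplitude, then scale.
std::vector<uint8_t> WaveToHapticBytes(const std::vector<float>& Samples,
                                       size_t Stride, // keep every Nth sample
                                       float Scale)   // 0.0 - 1.0
{
    std::vector<uint8_t> Bytes;
    for (size_t i = 0; i < Samples.size(); i += Stride)
    {
        float Amplitude = std::fabs(Samples[i]);  // amplitude, sign discarded
        float Value = Amplitude * 255.0f * Scale; // map into 0-255 and scale
        if (Value > 255.0f) Value = 255.0f;
        Bytes.push_back(static_cast<uint8_t>(Value));
    }
    return Bytes;
}
```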

Haptics Sample

The TouchSample, available from our Unreal GitHub repository, illustrates basic use of Oculus Touch, including haptics control using PlayHapticEffect() and PlayHapticSoundWave(). For more information, see Unreal Samples.

Guardian System Boundary Component

OculusBoundaryComponent exposes an API for interacting with the Oculus Guardian System.

During Touch setup, users define an interaction area by drawing a perimeter called the Outer Boundary in space with the controller. An axis-aligned bounding box called the Play Area is calculated from this perimeter.
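One plausible reading of that calculation can be sketched in plain C++ (the types and function name below are hypothetical, and the Oculus runtime performs this for you): take the axis-aligned bounding box of the floor-level perimeter points.

```cpp
#include <algorithm>
#include <vector>

// Illustrative sketch: derive an axis-aligned Play Area box from the
// user-drawn Outer Boundary perimeter at floor level.
struct Point2D { float X; float Y; };
struct Box2D { float MinX, MinY, MaxX, MaxY; };

Box2D PlayAreaFromPerimeter(const std::vector<Point2D>& Perimeter)
{
    Box2D Box{Perimeter[0].X, Perimeter[0].Y, Perimeter[0].X, Perimeter[0].Y};
    for (const Point2D& P : Perimeter)
    {
        Box.MinX = std::min(Box.MinX, P.X);
        Box.MinY = std::min(Box.MinY, P.Y);
        Box.MaxX = std::max(Box.MaxX, P.X);
        Box.MaxY = std::max(Box.MaxY, P.Y);
    }
    return Box;
}
```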

When tracked devices approach the Outer Boundary, the Oculus runtime automatically provides visual cues to the user demarcating the Outer Boundary. This behavior may not be disabled or superseded by applications, though the Guardian System visualization may be disabled via user configuration in the Oculus App.

Additional handling may be implemented by applications using the class UOculusBoundaryComponent. Possible use cases include pausing the game if the user leaves the Play Area, placing geometry in the world based on boundary points to create a “natural” integrated barrier with in-scene objects, disabling UI when the boundary is being rendered to avoid visual discomfort, et cetera.

All UOculusBoundaryComponent public methods are available as Blueprints.

Please see OculusBoundaryComponent.h for additional details.

Basic Use

Boundary types are Boundary_Outer and Boundary_PlayArea.

Device types are HMD, LTouch, RTouch, Touch (i.e., both controllers), and All.

Applications may query the interaction between devices and the Outer Boundary or Play Area by using UOculusBoundaryComponent::GetTriggeredPlayAreaInfo() or UOculusBoundaryComponent::GetTriggeredOuterBoundaryInfo().

Applications may also query arbitrary points relative to the Play Area or Outer Boundary using UOculusBoundaryComponent::CheckIfPointWithinOuterBounds() or UOculusBoundaryComponent::CheckIfPointWithinPlayArea(). This may be useful for determining the location of particular Actors in a scene relative to boundaries so that, for example, they are spawned within reach.
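As a standalone illustration of the kind of test CheckIfPointWithinOuterBounds() performs, the sketch below runs a standard ray-casting point-in-polygon check against floor-level boundary points. The types and function names are hypothetical plain C++, not the Unreal API, which operates on the component in world space.

```cpp
#include <cstddef>
#include <vector>

// Illustrative even-odd (ray-casting) point-in-polygon test over the
// floor-level Outer Boundary points.
struct Point2D { float X; float Y; };

bool PointInPolygon(const std::vector<Point2D>& Poly, Point2D P)
{
    bool Inside = false;
    for (std::size_t i = 0, j = Poly.size() - 1; i < Poly.size(); j = i++)
    {
        // Does the edge (j -> i) cross the horizontal ray at P.Y?
        bool Crosses = (Poly[i].Y > P.Y) != (Poly[j].Y > P.Y);
        if (Crosses)
        {
            float XAtY = Poly[j].X + (Poly[i].X - Poly[j].X) *
                         (P.Y - Poly[j].Y) / (Poly[i].Y - Poly[j].Y);
            if (P.X < XAtY)
                Inside = !Inside; // flip parity on each crossing
        }
    }
    return Inside;
}
```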

Results are returned as a struct called FBoundaryTestResult, which includes the following members:

Member             | Type               | Description
IsTriggering       | bool               | True if the device or point triggers the queried boundary type.
DeviceType         | ETrackedDeviceType | Device type triggering the boundary.
ClosestDistance    | float              | Distance between the device or point and the closest point of the test area.
ClosestPoint       | FVector            | Location in tracking space of the closest boundary point to the queried device or point.
ClosestPointNormal | FVector            | Normal of the closest boundary point to the queried device or point.

All dimensions, points, and vectors are returned in Unreal world coordinate space.
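To make the fields concrete, here is a hypothetical plain-C++ sketch of a boundary test for a 2D point against an axis-aligned Play Area. The runtime computes the real values; in particular, treating IsTriggering as "the point lies outside the Play Area" is an assumption of this sketch, and all names below are illustrative.

```cpp
#include <algorithm>
#include <cmath>

// Illustrative stand-in for FBoundaryTestResult, floor plane only.
struct ResultSketch
{
    bool  IsTriggering;
    float ClosestDistance;
    float ClosestX, ClosestZ; // stand-in for ClosestPoint (FVector)
};

ResultSketch TestPointAgainstPlayArea(float MinX, float MinZ,
                                      float MaxX, float MaxZ,
                                      float PX, float PZ)
{
    ResultSketch R{};
    bool Inside = PX >= MinX && PX <= MaxX && PZ >= MinZ && PZ <= MaxZ;
    R.IsTriggering = !Inside; // assumption: triggering == outside the box
    if (Inside)
    {
        // Closest boundary point: project onto the nearest edge.
        float DLeft = PX - MinX, DRight = MaxX - PX;
        float DNear = PZ - MinZ, DFar  = MaxZ - PZ;
        float D = std::min(std::min(DLeft, DRight), std::min(DNear, DFar));
        R.ClosestDistance = D;
        if (D == DLeft)       { R.ClosestX = MinX; R.ClosestZ = PZ; }
        else if (D == DRight) { R.ClosestX = MaxX; R.ClosestZ = PZ; }
        else if (D == DNear)  { R.ClosestX = PX;   R.ClosestZ = MinZ; }
        else                  { R.ClosestX = PX;   R.ClosestZ = MaxZ; }
    }
    else
    {
        // Closest boundary point: clamp the point into the box.
        R.ClosestX = std::max(MinX, std::min(PX, MaxX));
        R.ClosestZ = std::max(MinZ, std::min(PZ, MaxZ));
        float DX = PX - R.ClosestX, DZ = PZ - R.ClosestZ;
        R.ClosestDistance = std::sqrt(DX * DX + DZ * DZ);
    }
    return R;
}
```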

Applications may request that boundaries be displayed or hidden using RequestOuterBoundaryVisible(). Note that the Oculus runtime will override application requests under certain conditions. For example, setting Outer Boundary visibility to false will fail if a tracked device is close enough to trigger the boundary’s automatic display, and setting the visibility to true will fail if the user has disabled the visual display of the boundary system.

Applications may query the current state of the boundary system using UOculusBoundaryComponent::IsOuterBoundaryDisplayed() and UOculusBoundaryComponent::IsOuterBoundaryTriggered().

You may bind delegates using the object OnOuterBoundaryTriggered.

Additional Features

You may set the boundary color of the automated Guardian System visualization (alpha is unaffected) using UOculusBoundaryComponent::SetOuterBoundaryColor(). Use UOculusBoundaryComponent::ResetOuterBoundaryColor() to reset to default settings.

UOculusBoundaryComponent::GetOuterBoundaryPoints() and UOculusBoundaryComponent::GetPlayAreaPoints() return an array of up to 256 3D points that define the Outer Boundary or Play Area in clockwise order at floor level. You may query the dimensions of the Outer Boundary or Play Area using UOculusBoundaryComponent::GetOuterBoundaryDimensions() or UOculusBoundaryComponent::GetPlayAreaDimensions(), which return a vector containing the width, height, and depth in tracking space units, with height always 0.
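As an illustration of how the reported dimensions relate to the boundary points, the following hypothetical plain-C++ sketch derives a (width, height, depth) triple from floor-level points, assuming a tracking-space convention with X/Z on the floor plane and height fixed at 0. The types and function name are illustrative, not the Unreal API.

```cpp
#include <algorithm>
#include <vector>

// Illustrative: compute (width, height, depth) from floor-level boundary
// points, with height always 0 as the API documents.
struct Point3D { float X; float Y; float Z; };
struct Dimensions { float Width; float Height; float Depth; };

Dimensions DimensionsFromPoints(const std::vector<Point3D>& Points)
{
    float MinX = Points[0].X, MaxX = Points[0].X;
    float MinZ = Points[0].Z, MaxZ = Points[0].Z;
    for (const Point3D& P : Points)
    {
        MinX = std::min(MinX, P.X);
        MaxX = std::max(MaxX, P.X);
        MinZ = std::min(MinZ, P.Z);
        MaxZ = std::max(MaxZ, P.Z);
    }
    return {MaxX - MinX, 0.0f, MaxZ - MinZ}; // height is always 0
}
```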

Boundary Sample

BoundarySample, available from our Unreal GitHub repository, illustrates the use of the Boundary Component API for interacting with our Guardian System. For more information, see Unreal Samples.