This section describes input handling for Oculus devices.
The Gear VR Controller is an orientation-tracked input device available through Unreal as a Motion Controller.
For a discussion of best practices, see Gear VR Controller Best Practices in Oculus Best Practices.
For instructions on how to add a Motion Controller component to your Pawn or Character, see Motion Controller Component Setup in Epic’s Unreal documentation. Unreal has also provided a detailed training tutorial called Setting Up VR Motion Controllers.
Gear VR positions the controller relative to the user by using a body model to estimate the controller’s position. Whether the controller is visualized on the left or right side of the body is determined by the handedness the user specifies during controller pairing.
Orientation tracking is handled automatically by the Motion Controller Component. If you need to query the controller orientation, you can query the Motion Controller rotation.
Add the GearVRControllerComponent to create a MotionController with a Gear VR Controller mesh as a child (4.15 only). The Gear VR Controller mesh may be found in Plugins/GearVR/Content/.
Motion Controller Components must be specified as either left or right controllers when they are added, and each Gear VR Controller button mapping has a left/right equivalent. However, any button click sends both left and right events, so the setting you choose when you add the Motion Controller component has no effect.
We provide a sample called GearVRControllerSample in the directory <install>/Samples/Oculus. Please see the sample and its Level Blueprint for a full illustration of how to use the controller in your game.
GearVRControllerSample includes two Blueprints:
The touchpad distinguishes between a touch, in which the user’s finger is touching the pad without pushing it down, and a click.
The touchpad may be queried using FaceButtons to check for clicks on the 12:00, 3:00, 6:00, and 9:00 positions, or with Thumbstick_X/Thumbstick_Y to check for the finger position anywhere on the pad using axis values from -1.0,-1.0 (lower-left corner) to 1.0,1.0 (upper-right corner).
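As an illustration of these axis values, the sketch below (standalone C++, not the Unreal input API; the function name is hypothetical) maps a touchpad contact position to the nearest of the four clock-face positions used by the FaceButton click mappings:

```cpp
#include <cmath>
#include <string>

// Illustrative only: maps a touchpad contact reported by Thumbstick_X /
// Thumbstick_Y (each in [-1.0, 1.0], lower-left to upper-right) to the
// nearest clock-face quadrant used by the FaceButton click mappings.
std::string QuadrantForTouch(float x, float y) {
    if (std::fabs(x) >= std::fabs(y)) {
        return x >= 0.0f ? "3:00" : "9:00";  // right / left halves
    }
    return y >= 0.0f ? "12:00" : "6:00";     // top / bottom halves
}
```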
All left/right button events are equivalent. The volume and home buttons are reserved.
| Input Mapping | Action |
|---|---|
| MotionController_Left_Thumbstick/Right_Thumbstick | touchpad click anywhere |
| MotionController_Left_FaceButton1/Right_FaceButton1 | click at 12:00 |
| MotionController_Left_FaceButton2/Right_FaceButton2 | click at 3:00 |
| MotionController_Left_FaceButton3/Right_FaceButton3 | click at 6:00 |
| MotionController_Left_FaceButton4/Right_FaceButton4 | click at 9:00 |
| MotionController_Left_FaceButton5/Right_FaceButton5 | back button |
| MotionController_Left_Thumbstick_X/Right_Thumbstick_X | touchpad contact position from -1.0 (left) to 1.0 (right) |
| MotionController_Left_Thumbstick_Y/Right_Thumbstick_Y | touchpad contact position from -1.0 (bottom) to 1.0 (top) |
| MotionController_Left_FaceButton6/Right_FaceButton6 | touchpad touch |
| MotionController_Left_Trigger/Right_Trigger | trigger |
For Gear VR Controllers, the user interface of your VR experience should follow these natural scrolling and swiping gestures:
You may use the standard Play Haptic Effect Blueprint to send a specified haptic curve to the Oculus Touch or Xbox controller. For more information, see Unreal’s Play Haptic Effect guide.
Play Haptic Effect may be configured to play haptic waves based on three types of input. Right-click in the Content Browser to bring up the context menu, select Miscellaneous, and then select one of the following three options:
The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effect. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Stop Haptic Effect sends a stop command to the Touch controller.
When the left controller X button is pressed, a constant vibration is sent by Set Haptics by Value until the button is released. Note that Set Haptics by Value calls are limited to 30 Hz; additional calls will be disregarded.
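To see the effect of the 30 Hz limit, the standalone sketch below (hypothetical names, not engine code) shows a simple client-side throttle that skips calls the runtime would discard anyway:

```cpp
#include <cstdint>

// Illustrative sketch: Set Haptics by Value calls are processed at most
// 30 times per second, so a client-side throttle avoids issuing updates
// that would be dropped. Timestamps are in milliseconds; 33 ms
// approximates 1/30 s. Names here are hypothetical.
class HapticThrottle {
public:
    // Returns true if a call issued at `nowMs` would be accepted.
    bool TryCall(uint64_t nowMs) {
        if (initialized_ && nowMs - lastCallMs_ < kMinIntervalMs) {
            return false;  // within ~33 ms of the last call: dropped
        }
        lastCallMs_ = nowMs;
        initialized_ = true;
        return true;
    }

private:
    static constexpr uint64_t kMinIntervalMs = 33;  // ~30 Hz
    uint64_t lastCallMs_ = 0;
    bool initialized_ = false;
};
```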

In addition to Play Haptic Effect, Unreal 4.12 adds Play Haptic Soundwave.
The following Blueprint illustrates a simple haptics sequence on the Oculus Touch controller using Play Haptic Effect and Play Haptic Soundwave. This example sends vibrations using Play Haptic Effect when the left controller grip button is pressed. When the button is released, Play Haptic Soundwave sends a second vibration to the controller.

APlayerController::PlayHapticSoundWave takes a mono soundwave as an argument. It downsamples the wave into a series of bytes that serially describe the amplitude of the wave (uint8 values 0-255). Each byte is then multiplied by the factor specified in Scale (max = 255), and haptic vibrations are sent to the targeted Oculus Touch controller. Each controller must be targeted individually. Call Stop Haptic Effect to stop haptic playback.
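The downsampling described above can be sketched in standalone C++ as follows; the function name and the peak-per-window resampling policy are assumptions for illustration, not the engine's actual implementation:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdint>
#include <vector>

// Illustrative sketch of the conversion PlayHapticSoundWave is described
// as performing: a mono waveform (samples in [-1.0, 1.0]) is reduced to
// one amplitude byte (0-255) per haptic tick, then scaled and clamped.
// The peak-over-window policy is an assumption, not engine code.
std::vector<uint8_t> WaveToHapticBytes(const std::vector<float>& samples,
                                       size_t samplesPerByte,
                                       float scale /* 0.0 - 1.0 */) {
    std::vector<uint8_t> out;
    for (size_t i = 0; i < samples.size(); i += samplesPerByte) {
        // Peak amplitude over the window stands in for one haptic byte.
        float peak = 0.0f;
        size_t end = std::min(i + samplesPerByte, samples.size());
        for (size_t j = i; j < end; ++j) {
            peak = std::max(peak, std::fabs(samples[j]));
        }
        float scaled = peak * 255.0f * scale;
        out.push_back(static_cast<uint8_t>(std::min(scaled, 255.0f)));
    }
    return out;
}
```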
The TouchSample, available from our Unreal GitHub repository, illustrates basic use of Oculus Touch, including haptics control using PlayHapticEffect() and PlayHapticSoundWave(). For more information, see Unreal Samples.
OculusRiftBoundaryComponent exposes an API for interacting with the Oculus Guardian System.
During Touch setup, users define an interaction area by drawing a perimeter called the Outer Boundary in space with the controller. An axis-aligned bounding box called the Play Area is calculated from this perimeter.
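As a rough illustration of that calculation, the standalone sketch below derives an axis-aligned bounding box from floor-level perimeter points. The types and function are hypothetical, and the runtime's actual Play Area computation may differ in detail:

```cpp
#include <vector>

// Hypothetical types for illustration; not the plugin's API.
struct Point2D { float x; float y; };
struct Box2D { float minX, minY, maxX, maxY; };

// Computes the axis-aligned bounding box of a drawn perimeter at floor
// level. Assumes a non-empty perimeter.
Box2D BoundingBox(const std::vector<Point2D>& perimeter) {
    Box2D box{perimeter[0].x, perimeter[0].y,
              perimeter[0].x, perimeter[0].y};
    for (const Point2D& p : perimeter) {
        if (p.x < box.minX) box.minX = p.x;
        if (p.y < box.minY) box.minY = p.y;
        if (p.x > box.maxX) box.maxX = p.x;
        if (p.y > box.maxY) box.maxY = p.y;
    }
    return box;
}
```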
When tracked devices approach the Outer Boundary, the Oculus runtime automatically provides visual cues to the user demarcating the Outer Boundary. This behavior may not be disabled or superseded by applications, though the Guardian System visualization may be disabled via user configuration in the Oculus App.

Applications may implement additional handling using the UOculusRiftBoundaryComponent class. Possible use cases include pausing the game if the user leaves the Play Area, placing geometry in the world based on boundary points to create a “natural” integrated barrier with in-scene objects, and disabling UI when the boundary is being rendered to avoid visual discomfort.
All UOculusRiftBoundaryComponent public methods are available as Blueprints.
Please see OculusRiftBoundaryComponent.h for additional details.
Boundary types are Boundary_Outer and Boundary_PlayArea.
Device types are HMD, LTouch, RTouch, Touch (i.e., both controllers), and All.
Applications may query the interaction between devices and the Outer Boundary or Play Area by using UOculusRiftBoundaryComponent::GetTriggeredPlayAreaInfo() or UOculusRiftBoundaryComponent::GetTriggeredOuterBoundaryInfo().
Applications may also query arbitrary points relative to the Play Area or Outer Boundary using UOculusRiftBoundaryComponent::CheckIfPointWithinOuterBounds() or UOculusRiftBoundaryComponent::CheckIfPointWithinPlayArea(). This may be useful for determining the location of particular Actors in a scene relative to the boundaries, for example, to verify that they are spawned within reach.
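As a standalone illustration of this kind of query (hypothetical types and names, not the plugin's API), a point-in-Play-Area test on the floor plane reduces to an axis-aligned box check:

```cpp
// Hypothetical stand-in for a query like CheckIfPointWithinPlayArea():
// treating the Play Area as an axis-aligned box on the floor plane, a
// point is inside if each coordinate lies within the box extents.
struct PlayAreaBox { float minX, minY, maxX, maxY; };

bool IsPointWithinPlayArea(const PlayAreaBox& area, float x, float y) {
    return x >= area.minX && x <= area.maxX &&
           y >= area.minY && y <= area.maxY;
}
```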
Results are returned as a struct called FBoundaryTestResult, which includes the following members:
| Member | Type | Description |
|---|---|---|
| IsTriggering | bool | True if the device or point triggers the queried boundary type. |
| DeviceType | ETrackedDeviceType | Device type triggering the boundary. |
| ClosestDistance | float | Distance between the device or point and the closest point of the test area. |
| ClosestPoint | FVector | Location in tracking space of the boundary point closest to the queried device or point. |
| ClosestPointNormal | FVector | Normal of the boundary point closest to the queried device or point. |
All dimensions, points, and vectors are returned in Unreal world coordinate space.
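For intuition about how a ClosestPoint/ClosestDistance pair can be derived, the standalone sketch below walks the edges of a boundary polygon and keeps the nearest candidate. This is generic 2D geometry with hypothetical names, not the plugin's implementation:

```cpp
#include <cmath>
#include <vector>

struct Vec2 { float x; float y; };

// Closest point on segment ab to p (clamped to the segment).
static Vec2 ClosestOnSegment(Vec2 a, Vec2 b, Vec2 p) {
    float abx = b.x - a.x, aby = b.y - a.y;
    float len2 = abx * abx + aby * aby;
    float t = len2 > 0.0f
        ? ((p.x - a.x) * abx + (p.y - a.y) * aby) / len2 : 0.0f;
    t = std::fmax(0.0f, std::fmin(1.0f, t));
    return {a.x + t * abx, a.y + t * aby};
}

// Walks every edge of a closed boundary polygon and returns the point
// nearest to p, writing the distance to *outDistance. Assumes a
// non-empty polygon.
Vec2 ClosestBoundaryPoint(const std::vector<Vec2>& poly, Vec2 p,
                          float* outDistance) {
    Vec2 best = poly[0];
    float bestD2 = 1e30f;
    for (size_t i = 0; i < poly.size(); ++i) {
        Vec2 c = ClosestOnSegment(poly[i], poly[(i + 1) % poly.size()], p);
        float dx = c.x - p.x, dy = c.y - p.y;
        float d2 = dx * dx + dy * dy;
        if (d2 < bestD2) { bestD2 = d2; best = c; }
    }
    *outDistance = std::sqrt(bestD2);
    return best;
}
```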
Applications may request that boundaries be displayed or hidden using RequestOuterBoundaryVisible(). Note that the Oculus runtime will override application requests under certain conditions. For example, setting Boundary Area visibility to false will fail if a tracked device is close enough to trigger the boundary’s automatic display. Setting the visibility to true will fail if the user has disabled the visual display of the boundary system.
Applications may query the current state of the boundary system using UOculusRiftBoundaryComponent::IsOuterBoundaryDisplayed() and UOculusRiftBoundaryComponent::IsOuterBoundaryTriggered().
You may bind delegates using the object OnOuterBoundaryTriggered.
You may set the boundary color of the automated Guardian System visualization (alpha is unaffected) using UOculusRiftBoundaryComponent::SetOuterBoundaryColor(). Use UOculusRiftBoundaryComponent::ResetOuterBoundaryColor() to reset to the default settings.
UOculusRiftBoundaryComponent::GetOuterBoundaryPoints() and UOculusRiftBoundaryComponent::GetPlayAreaPoints() return an array of up to 256 3D points that define the Boundary Area or Play Area in clockwise order at floor level. You may query the dimensions of a Boundary Area or Play Area using UOculusRiftBoundaryComponent::GetOuterBoundaryDimensions() or UOculusRiftBoundaryComponent::GetPlayAreaDimensions(), which return a vector containing the width, height, and depth in tracking space units, with height always returning 0.
BoundarySample, available from our Unreal GitHub repository, illustrates the use of the Boundary Component API for interacting with our Guardian System. For more information, see Unreal Samples.