The Oculus Unity Sample Framework provides sample scenes and guidelines for common VR-specific features such as hand presence with Oculus Touch, crosshairs, driving, hybrid mono rendering, and video rendering to a 2D textured quad.
The Unity Sample Framework can guide developers in producing reliable, comfortable applications and avoiding common mistakes. The assets and scripts included with the Sample Framework may be reused in your applications per the terms of our SDK 3.4 license.
It is available as a Unity Package for developers who wish to examine how the sample scenes were implemented, and as binaries for the Rift and Gear VR for developers to explore the sample scenes entirely in VR. The Rift executable is available from our Downloads Center, and the Gear VR application is available from our Oculus Store in the Gallery section.

The Unity Sample Framework requires Unity 5.4 or later. Please check Compatibility and Requirements for up-to-date version recommendations.
In the Unity project, the following scenes are found in /Assets/SampleScenes:
| Scene | Directory | Concept Illustrated |
|---|---|---|
| Multiple Cameras | Cameras/ | Switching between cameras in a scene. |
| Per-Eye Cameras | Cameras/ | Using different cameras for each eye for a specific object in the scene. |
| Crosshairs | First Person/ | Using crosshairs to aim a weapon in VR and different configuration options. |
| Teleport | First Person/Locomotion/ | A teleportation locomotion scene that reduces the risk of discomfort. |
| Mirror | First Person/ | A simple mirror effect. |
| Outdoor Motion | First Person/ | Basic forms of movement, and the effects a variety of design choices may have on comfort. |
| Scale | First Person/ | How various scale factors interact. |
| Stairs | First Person/ | Factors affecting comfort in first-person stairs movement. |
| AvatarWithGrab | Hands/ | Uses the Oculus Avatar SDK and the OVRGrabber and OVRGrabbable scripts to illustrate hand presence with Touch. Pick up and throw blocks from a table using the Touch grip buttons (see the grab setup sketch after this table). This sample requires importing the Oculus Avatar SDK. |
| CustomControllers | Hands/ | A simple sample displaying tracked Touch models in a scene. |
| CustomHands | Hands/ | Uses low-resolution custom hand models and the OVRGrabber and OVRGrabbable scripts to illustrate hand presence with Touch. Pick up and throw blocks from a table using the Touch grip buttons. May be used as a reference for implementing your own hand models. |
| Input Tester | Input/ | This scene assists with testing input devices, displaying axis values in real time. |
| Keyboard | Input/ | A virtual keyboard. |
| Movie Player | Rendering/ | Video rendering to a 2D textured quad using the Android Media Surface Plugin. Source for the plugin ships with the Mobile SDK in \VrAppSupport\MediaSurfacePlugin. |
| Surface Detail | Rendering/ | Different ways to create surface detail with normal, specular, parallax, and displacement mapping. |
| PerfTest | Rendering/ | Demonstrates hybrid mono rendering, in which near content is rendered stereoscopically and distant content is rendered monoscopically using scripts and shaders. |
| StereoMonoRoom | Rendering/ | Another implementation of the hybrid mono rendering feature also demonstrated by PerfTest (see above). |
| OverlayUIDemo | UI/ | Demonstrates creating a UI with a VR Compositor Layer to improve image quality and anti-aliasing. Includes a quad overlay for Rift, and a quad and a cylinder overlay for mobile. |
| Pointers | UI/ | How UI elements can be embedded in a scene and interact with different gaze controllers. |
| Pointers - Gaze Click | UI/ | An extension of the Pointers scene, with gaze selection. |
| Tracking Volume | UI/ | Different ways to indicate the user is about to leave the position tracking volume. |
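The AvatarWithGrab and CustomHands scenes attach OVRGrabbable to the blocks on the table and OVRGrabber to the hand anchors. The hypothetical helper below is a minimal sketch of which pieces an object typically needs before OVRGrabber can pick it up; it assumes the OVRGrabbable component behaves as a standard MonoBehaviour from the Oculus Integration package and is not the samples' own code (in the sample scenes the components are configured in the Editor rather than at runtime).

```csharp
using UnityEngine;

// Hypothetical helper: makes the GameObject it is attached to grabbable by an
// OVRGrabber-equipped hand. Assumes OVRGrabbable needs only a Collider and a
// Rigidbody, as in the AvatarWithGrab and CustomHands scenes.
public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // A collider is required so the grabber's trigger volume can detect the object.
        if (GetComponent<Collider>() == null)
        {
            gameObject.AddComponent<BoxCollider>();
        }

        // A rigidbody lets the object be thrown with the hand's velocity on release.
        Rigidbody body = GetComponent<Rigidbody>();
        if (body == null)
        {
            body = gameObject.AddComponent<Rigidbody>();
        }
        body.useGravity = true;

        // OVRGrabbable marks the object as something OVRGrabber can pick up.
        if (GetComponent<OVRGrabbable>() == null)
        {
            gameObject.AddComponent<OVRGrabbable>();
        }
    }
}
```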
These samples are intended to be tools for exploring design ideas in VR and should not necessarily be construed as design recommendations. The Sample Framework allows you to set some parameters to values that will reliably cause discomfort in most users; they are available precisely to give developers an opportunity to find out how much is too much.
It is also important to play-test your game with a range of players throughout development to ensure it is a comfortable experience. We have provided in-game warnings to alert you to potentially uncomfortable scenes.
To download the Oculus Sample Framework Unity Project or Rift binary, visit our Downloads Center. The Gear VR Sample Framework application may be downloaded for free from the Gallery Apps section of the Oculus Store.
To run the PC Binary:
To open the project in the Unity Editor:
Opening the project is only necessary if you want to experiment with it, as application binaries are provided by Oculus for free download.
To build the Unity Project for Rift:
To build the Unity Project for the Gear VR:
Sample scenes are browsed and controlled with a simple UI which provides in-app explanatory notes. Parameter controls allow users to adjust settings, providing an immediate, direct experience of the impact of different design decisions. The Sample Framework control panel itself is an example of in-VR control and navigation, and may be used as a model for your own applications.
We have provided a Windows executable for use with the Oculus Rift or DK2. The Gear VR application may be downloaded for free from the Gallery Apps section of the Oculus Store. These applications are simply builds of the Unity project.
Navigation
Launch the Sample Framework on Rift or Gear VR to load the startup scene. You will see the Inspector, a three-pane interface providing scene settings controls, documentation, and navigation controls for browsing to other scenes. Making selections in the top-level menu in the left panel changes the content of the other two panels. The center panel is a contextual menu, and the right panel displays notes and instructions for the current scene.

Inspector navigation is primarily gaze-controlled, supplemented by a mouse and keyboard, a gamepad (PC or Gear VR), or the Gear VR touchpad.
To launch a scene from the center panel, you may select and click the scene with a mouse, gaze at the scene name and press the A button on a gamepad, or tap the Gear VR touchpad.
Some scenes are grouped into folders (displayed as buttons). When browsing from a folder, select the “..” button to navigate one level up in the scene hierarchy.
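For reference, the hypothetical snippet below shows one way to poll those selection inputs each frame using the OVRInput API from the Oculus Utilities for Unity. It is only a sketch of the idea; the Sample Framework's own UI scripts are more involved, and the method called on a confirmed selection is a placeholder.

```csharp
using UnityEngine;

// Hypothetical example: polls the inputs accepted for "clicking" the item the
// user is currently gazing at. Assumes the Oculus Utilities for Unity
// (OVRInput) are present in the project.
public class GazeClickInput : MonoBehaviour
{
    void Update()
    {
        bool clicked =
            Input.GetMouseButtonDown(0) ||                      // mouse click (PC)
            OVRInput.GetDown(OVRInput.Button.One) ||            // A button on a gamepad or Touch
            OVRInput.GetDown(OVRInput.Button.PrimaryTouchpad);  // Gear VR touchpad tap

        if (clicked)
        {
            ActivateGazedItem(); // placeholder for launching the gazed-at scene or button
        }
    }

    void ActivateGazedItem()
    {
        // In a real implementation this would raycast from the center of the
        // camera and invoke the UI element that the gaze ray hits.
    }
}
```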
Scrolling
Some panels support vertical scrolling. Several scrolling methods are implemented in order to illustrate some of the available options for this feature: