The Oculus Sample Framework for Unity 5 is an experimental sandbox for VR developers and designers. It consists of a set of simple scenes that illustrate solutions to typical problems in VR, including the mechanics of first-person movement or gaze-based UI. Each sample provides an in-app control panel allowing users to change parameters such as the length of the fade to black in a teleport, the delay on a UI element, or the z-depth of a crosshair reticle. This control panel is itself an illustration of in-VR controls that can be used to help build similar controls in your own applications.
Each scene in the project explores a concept relevant to VR. We have provided a Unity 5 project for the whole framework, an executable for Windows, and a free download in the Gear VR Concepts store. The binaries allow you to dive in and explore a range of ideas without opening the project in the Unity Editor. Each scene provides in-VR documentation, so you can jump straight in. The samples can be downloaded from our Download Center.
If you would like to experiment with modifying the samples beyond the ranges allowed by in-game controls, building the project in Unity is the way to go.
These samples serve primarily as a tool to explore design ideas in VR and should not be taken as general recommendations. Each sample has controls that allow a range of mechanics to be tested, and some settings reliably produce discomfort. We hope this range of control will allow you to experiment and discover what does and doesn’t work.
A Brief Tour of Sample Scenes
The Sample Framework contains around a dozen different scenes. The best way to learn about the scenes is to load them and try them out. But, as a taster, let’s look at a few scenes…
Crosshair
This scene illustrates a gaze-based crosshair or reticle and explores relevant factors that must be considered. In a non-VR game, it’s quite trivial to draw a crosshair in the middle of the view, but VR developers must decide the depth at which the crosshair should be displayed. A crosshair drawn too close to the viewer is uncomfortable to look at, but a crosshair drawn too far away gives conflicting depth cues when it appears to be further away than the scene geometry it is rendered on top of.
One way to deal with this problem is to raycast along the player’s line of sight and place the crosshair at a depth corresponding to the first object along that line. This keeps the crosshair at a comfortable depth without giving conflicting depth cues. Our sample scene allows you to try this and other modes, and to manually control the depth of the crosshair in real-time.
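As a concrete illustration, here is a minimal sketch of the raycast approach, assuming a world-space reticle object. The class name GazeReticle and the fields reticle and maxDepth are illustrative, not the Sample Framework's actual API:

```csharp
using UnityEngine;

// Minimal sketch: place a world-space reticle at the depth of the first
// object along the user's line of sight. Names here are illustrative,
// not the Sample Framework's actual components.
public class GazeReticle : MonoBehaviour
{
    public Transform reticle;     // a quad or sprite rendered in world space
    public float maxDepth = 10f;  // fallback depth when nothing is hit

    void LateUpdate()
    {
        Transform cam = Camera.main.transform;
        RaycastHit hit;
        // Default to the fallback depth; snap to hit geometry when available.
        float depth = maxDepth;
        if (Physics.Raycast(cam.position, cam.forward, out hit, maxDepth))
            depth = hit.distance;

        reticle.position = cam.position + cam.forward * depth;
        // Scale with depth so the reticle keeps a constant angular size.
        reticle.localScale = Vector3.one * depth * 0.01f;
        reticle.rotation = Quaternion.LookRotation(cam.forward);
    }
}
```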
This scene also looks at other concepts, such as how projectiles travel from a visible weapon. If you use the crosshair for a projectile weapon held at a fixed distance from the player, you must decide whether the projectiles follow a line from the muzzle of the gun through the crosshair, or follow a line parallel with the line drawn from the center of the player’s view to the crosshair. The latter behavior is usually more intuitive, and is similar to the behavior typically seen in a non-VR game. With this sample, you can experience and understand the difference directly.
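To make the distinction concrete, here is a small sketch of the two aiming modes; the inputs (muzzle, eye, crosshairWorldPos) are hypothetical names for illustration:

```csharp
using UnityEngine;

// Sketch of the two aiming modes described above.
public static class AimModes
{
    // Mode A: the projectile converges on the crosshair from the muzzle,
    // so its path crosses the line of sight at the crosshair's depth.
    public static Vector3 ThroughCrosshair(Vector3 muzzle, Vector3 crosshairWorldPos)
    {
        return (crosshairWorldPos - muzzle).normalized;
    }

    // Mode B: the projectile travels parallel to the eye-to-crosshair ray,
    // which usually matches what players expect from non-VR games.
    public static Vector3 ParallelToGaze(Vector3 eye, Vector3 crosshairWorldPos)
    {
        return (crosshairWorldPos - eye).normalized;
    }
}
```

In a weapon script, you would spawn the projectile at the muzzle and set its velocity along whichever direction the chosen mode returns.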
Locomotion
This sample scene allows you to experiment with first-person movement controls in real time to better understand the relevant design trade-offs. Great care must be taken when developing a motion control scheme based on first-person controls, as it is easy to cause discomfort.
The scene allows you to vary factors such as:
- Movement speed
- Turn rate
- Whether or not the direction of travel is controlled by gaze direction
- The “step size” of rotation (either continuous rotation, or rotation through fixed steps of a specified size; see the sketch after this list)
- The speed of animation while performing these “step” rotations (if not instantaneous)
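As an example of the last two items, here is a minimal sketch of stepped (“snap”) rotation with an adjustable animation speed. The field names and the input mapping are assumptions for illustration, not the Sample Framework's actual code:

```csharp
using UnityEngine;

// Sketch of stepped ("snap") rotation with an optional animated transition.
public class SnapTurn : MonoBehaviour
{
    public float stepSizeDegrees = 30f;   // size of each rotation step
    public float degreesPerSecond = 180f; // animation speed; <= 0 means instantaneous
    float currentYaw, targetYaw;
    bool stickReleased = true;

    void Start() { currentYaw = targetYaw = transform.eulerAngles.y; }

    void Update()
    {
        float input = Input.GetAxis("Horizontal");
        // Queue one step per stick flick; require release before the next.
        if (stickReleased && Mathf.Abs(input) > 0.5f)
        {
            targetYaw += stepSizeDegrees * Mathf.Sign(input);
            stickReleased = false;
        }
        else if (Mathf.Abs(input) < 0.2f)
            stickReleased = true;

        currentYaw = degreesPerSecond <= 0f
            ? targetYaw  // instantaneous snap
            : Mathf.MoveTowards(currentYaw, targetYaw, degreesPerSecond * Time.deltaTime);
        transform.rotation = Quaternion.Euler(0f, currentYaw, 0f);
    }
}
```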
Teleportation
Teleportation is a popular method of locomotion in VR, mainly because it avoids the discomfort problems of first-person controls. This sample scene shows a simple implementation of a teleportation mechanic.
One subtle consideration explored in this sample is the post-teleportation orientation of the traveler. In one mode, the player controls the post-teleportation view direction before making the jump: a glowing avatar appears at the teleportation destination, indicating a final orientation the player can adjust. After the teleport, the player faces the direction in which the avatar was facing.
This sounds straightforward, but the player may have rotated their head to look at the teleport destination. Should the player find themselves looking in the avatar’s direction immediately after the teleport, while their head is still rotated? Or should the new view direction match the avatar’s only once they rotate their head back to its neutral forward position?
There is a case to be made for either approach, and what feels most intuitive depends on other factors. This sample helps you experience the difference between these two modes directly, and also lets you experiment with interpolating between them.
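A minimal sketch of these orientation modes, with a blend parameter for the interpolation, might look like the following; rig (the tracking-space root), head (the HMD camera), and the Apply signature are hypothetical:

```csharp
using UnityEngine;

// Sketch of the two post-teleport orientation modes with a blend factor.
public static class TeleportOrient
{
    // avatarYaw: the destination orientation chosen before the jump.
    // blend = 0: the view matches avatarYaw immediately, head rotation included.
    // blend = 1: the view matches avatarYaw once the head returns to neutral.
    public static void Apply(Transform rig, Transform head,
                             Vector3 destination, float avatarYaw, float blend)
    {
        // Yaw of the head relative to the rig (how far the neck is turned).
        float headYawOffset = head.eulerAngles.y - rig.eulerAngles.y;
        // Compensate for all, none, or part of the head's offset.
        float rigYaw = avatarYaw - Mathf.LerpAngle(headYawOffset, 0f, blend);
        rig.rotation = Quaternion.Euler(0f, rigYaw, 0f);
        rig.position = destination;
    }
}
```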
User Interface
The scenes “Pointers” and “Pointers – Gaze Click” explore the use of pointers in VR to interact with user interfaces. Both scenes explore the mechanics of gaze-based interaction, covering interaction with physical objects and planar UIs.
The scenes are similar, but “Pointers – Gaze Click” does not require the use of gamepad buttons or the Gear VR touchpad. In this scene, gazing at a UI element causes a circular progress bar to appear near the gaze cursor and immediately begin filling. When the circle fills completely, the corresponding UI action is performed.
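A minimal sketch of such a gaze-dwell trigger, assuming a collider on the UI element, might look like this; GazeDwellButton and its fields are illustrative, not the scene's actual implementation:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Sketch of a gaze-dwell trigger: while the gaze ray stays on this object,
// a fill value rises from 0 to 1; at 1 the action fires. In a real scene,
// "fill" would also drive the circular progress image.
public class GazeDwellButton : MonoBehaviour
{
    public float dwellSeconds = 1.5f; // time to fill the progress circle
    public UnityEvent onActivate;     // UI action to perform
    float fill;

    void Update()
    {
        Transform cam = Camera.main.transform;
        RaycastHit hit;
        bool gazed = Physics.Raycast(cam.position, cam.forward, out hit, 100f)
                     && hit.transform == transform;

        // Fill while gazed; reset as soon as the gaze leaves.
        fill = gazed ? fill + Time.deltaTime / dwellSeconds : 0f;
        if (fill >= 1f)
        {
            fill = 0f;
            onActivate.Invoke();
        }
    }
}
```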
Other Sample Scenes
We can’t provide a detailed description of all the scenes in this blog post, but each scene has its own documentation viewable in the VR app itself. Other scenes include:
- Keyboard: A simple implementation of an in-VR keyboard.
- Mirror: Shows how an avatar head can be controlled to reflect user movement, giving the impression of a mirror in the scene (sketched after this list).
- Stairs: Explores the effect of stairs on comfort in a first-person experience.
- Surface Detail: Explores different shading techniques for portraying surface detail, allowing you to understand their effectiveness in VR.
- Tracking Volume: Illustrates some ways of indicating when a user gets too close to the edge of the tracking frustum.
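As a taste of the Mirror idea, here is a minimal sketch that poses a duplicate head as the reflection of the tracked head across a plane. MirroredHead and its fields are hypothetical, and reflecting the forward and up vectors is a common approximation rather than a true mirror transform:

```csharp
using UnityEngine;

// Sketch: pose this object as the reflection of "head" across "mirrorPlane".
public class MirroredHead : MonoBehaviour
{
    public Transform head;        // tracked HMD transform
    public Transform mirrorPlane; // its forward vector is the plane normal

    void LateUpdate()
    {
        Vector3 n = mirrorPlane.forward;
        Vector3 toHead = head.position - mirrorPlane.position;
        // Reflect the position across the plane.
        transform.position = head.position - 2f * Vector3.Dot(toHead, n) * n;
        // Approximate the reflected orientation by mirroring forward and up.
        Vector3 f = Vector3.Reflect(head.forward, n);
        Vector3 u = Vector3.Reflect(head.up, n);
        transform.rotation = Quaternion.LookRotation(f, u);
    }
}
```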
Conclusion
This sample set serves as a starting point for understanding and exploring some common design decisions in VR. Needless to say, these samples only scratch the surface of the innovative solutions that are constantly arising in the VR community. Hopefully you’ll find them useful as a starting point for trying out your own experiments and discovering new ideas.
To get started, search for “Oculus Sample Framework” on the Gear VR store, or download the project here: /downloads/unity/