Object Interaction Part 3: Releasing Objects
Oculus Developer Blog
Posted by Eric Cosky
May 25, 2017

This is the third in a series of four articles reviewing common usage patterns, technical issues, trade-offs, and pitfalls that are important to consider when implementing a VR interaction system.

The fundamental goal of any interaction system is to be as easy to learn as possible, with consistent rules so users can anticipate how to interact with anything in the virtual world, regardless of any special behavior an object might have. When an accurate physical simulation is combined with logic that coordinates input, animation, and scripted behavior, the result is an experience where objects consistently behave as expected during player interactions.

Releasing, Dropping, and Throwing Held Objects

When the user releases an object into the world, there is an expectation that the object will behave in a physically plausible manner. The input latency must be low so the object detaches when and where the user intends, the rigid body must be in a valid state when the object becomes physically active, and the velocity should take into account the motion of the hands prior to release. A correct implementation will allow the user to toss objects from hand to hand and throw objects that move through the environment as the user expects.

Getting correct velocities can be tricky because, depending on the attachment type, held objects might not carry velocities that make sense. For instance, teleporting and hierarchical attachment types will not have velocity at all and require additional code to set the correct values. Objects attached with fixed joints should have the right velocity if the disconnection occurs after the physics update, and position matching should behave similarly to fixed joints. It is possible to determine the release velocity from changes in velocity and position since the previous frame, and it can be helpful to average these values over the past several frames to smooth out irregularities and produce a more predictable release trajectory. Another option is to use the Oculus Touch APIs for retrieving the local controller rotation and angular velocity, as demonstrated in the Unity Sample Framework hand samples.
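The frame-averaging approach described above can be sketched as follows. This is a minimal illustration, not engine code: it keeps a short ring buffer of finite-difference velocity samples and averages them at release time. The class and parameter names are invented for this example.

```python
from collections import deque

class ReleaseVelocityEstimator:
    """Averages per-frame hand velocities over a short window to
    produce a smoothed, more predictable release velocity."""

    def __init__(self, window=5):
        self.samples = deque(maxlen=window)  # recent velocity samples
        self.prev_pos = None

    def update(self, pos, dt):
        # Record the finite-difference velocity for this frame.
        if self.prev_pos is not None and dt > 0:
            vel = tuple((p - q) / dt for p, q in zip(pos, self.prev_pos))
            self.samples.append(vel)
        self.prev_pos = pos

    def release_velocity(self):
        # Average the buffered samples; no samples means no motion data.
        if not self.samples:
            return (0.0, 0.0, 0.0)
        n = len(self.samples)
        return tuple(sum(axis) / n for axis in zip(*self.samples))
```

The same idea extends to angular velocity by buffering per-frame rotation deltas; in practice a window of three to five frames is usually enough to smooth out tracking jitter without making the throw feel laggy.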

Sometimes it is necessary to adjust the release trajectory in order to meet design goals, such as auto-aiming to targets. It might be necessary to calculate an entirely new trajectory so the object will reach the intended target, although this should probably be limited to cases where the original trajectory is within some tolerance of the ideal trajectory. Other situations might require a more subtle nudge towards a target chosen by some heuristic such as the target closest to where the user is looking.
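A tolerance-gated nudge like the one described above might look like the sketch below. It assumes some other code has already solved for the ideal velocity that would hit the target; the function only decides whether the user's throw is close enough to deserve assistance, and if so blends toward the ideal. All names and tolerances are illustrative.

```python
import math

def nudge_toward_target(release_vel, ideal_vel,
                        max_angle_deg=15.0, blend=0.5):
    """Blend the user's release velocity toward the ideal target-hitting
    velocity, but only when the throw is already within a small angular
    tolerance of it; otherwise respect the original trajectory."""
    dot = sum(a * b for a, b in zip(release_vel, ideal_vel))
    mag_r = math.sqrt(sum(a * a for a in release_vel))
    mag_i = math.sqrt(sum(b * b for b in ideal_vel))
    if mag_r == 0.0 or mag_i == 0.0:
        return release_vel
    cos_angle = max(-1.0, min(1.0, dot / (mag_r * mag_i)))
    if math.degrees(math.acos(cos_angle)) > max_angle_deg:
        return release_vel  # too far off target: leave the throw alone
    # Linear blend between the actual and ideal velocities.
    return tuple(a + (b - a) * blend for a, b in zip(release_vel, ideal_vel))
```

A `blend` of 1.0 gives full auto-aim within the tolerance cone, while smaller values give the subtler nudge mentioned above.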

Persistently Held Objects

While many designs expect users to pick up and drop objects at any time, some applications will attach objects to the player more persistently. An example might be a combat game where constantly holding the grip button to keep a weapon in hand would cause fatigue. In this case, it might be more comfortable to keep weapons attached until the user explicitly holsters the weapon when they need to use the hand for something else. While it might be tempting to use a simplified attachment scheme for scenarios with limited item swapping, a more capable attachment system will provide additional design options such as the ability to swap held items with others found in the environment or in the inventory, or to store held items during interaction with fixed controls.

Breaking Attachments

There are many reasons why a design might require held objects to support collisions with other objects, and there will be times when collisions cannot be resolved without moving or rotating the object from the desired grip position. A small rotation or position offset might be enough to keep objects separated. If this is the case, the attachment might convert to a “soft” attachment where the object is not exactly where it would like to be, but close enough that it can continue to try to reach the goal position without looking visibly broken. At some point, distances or rotations might exceed tolerances and it then becomes reasonable to detach the held object and release it back to the environment. For example, the user may put their hand around a corner and walk away in a direction that prevents the object from reaching the hand without collision. In some designs, it might be acceptable to disregard the collision and snap the object to the hand. If the design requires that objects respect collision rules at all times, then the system should automatically release the object once it reaches the breaking distance or orientation limits.
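The rigid-to-soft-to-broken progression described above can be expressed as a small state check. This is a hedged sketch with invented names and thresholds; a real implementation would compute the pose errors from the physics body and the grip transform each frame.

```python
import math

def attachment_state(goal_pos, actual_pos, angle_error_deg,
                     soft_dist=0.02, break_dist=0.25, break_angle=60.0):
    """Classify a held object's attachment by how far collisions have
    pushed it from its goal grip pose. Thresholds are illustrative and
    should be tuned per design. Returns 'rigid', 'soft', or 'broken'."""
    dist = math.dist(goal_pos, actual_pos)
    if dist > break_dist or angle_error_deg > break_angle:
        return 'broken'   # detach and release back to the environment
    if dist > soft_dist:
        return 'soft'     # keep seeking the goal pose each frame
    return 'rigid'        # close enough to treat as firmly held
```

Running this check every physics step gives the hand-around-a-corner case a natural resolution: the error grows as the player walks away until the break distance is exceeded and the object is released.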

If the game supports holding objects in each hand, then the design also needs to consider the collision behavior between them. Many games will simply disable collision between the held objects; however, if the goal is to prevent any visual overlap, then it becomes necessary for held objects to collide. When these collisions occur, one of the held objects may need to switch to a soft attachment in order to maintain separation by rotating or separating slightly from the attachment pose, subject to the soft attachment breaking limits described above.

Moving and Held Objects

In many VR applications, users are able to move by teleporting. It is easy to imagine how teleporting the player to a new location could place a held object into an invalid location, however instant rotations of the player are just as capable of moving a held object somewhere it should not be. If player locomotion includes teleports or snap turns, then it will be necessary to decide if, and how, to resolve the collisions that might occur for held objects.

The best way to keep the physics system in a valid state is to only teleport objects to known valid positions and only move kinematic objects when it is certain that dynamic objects will have room to move out of the way. While it is possible to test the destination location for collisions for the player and all held objects, this technique can break designs that would otherwise work in a non-VR game, specifically games that rely on geometry to limit where objects can go. A few examples include being able to reach across walls to retrieve objects, or teleporting through narrow doorways with large objects in hand. It may be important to detect these situations and either prevent the move or force held objects to detach so they remain in the area intended by the designer.
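The detect-and-detach policy above can be sketched engine-agnostically. Here `is_clear` stands in for whatever overlap query the engine provides (for example, a physics overlap test of the object's volume at the destination); the callback, the dictionary fields, and the policy of dropping blocked objects in place are all assumptions for illustration.

```python
def resolve_teleport(destination, held_objects, is_clear):
    """Decide what happens to held objects when the player teleports.
    Objects whose volume would overlap geometry at the destination are
    detached so they stay in the area the designer intended.
    Returns the list of objects that remain held after the teleport."""
    still_held = []
    for obj in held_objects:
        if is_clear(destination, obj):
            still_held.append(obj)
        else:
            obj['attached'] = False  # drop it at its current location
    return still_held
```

The same check works for snap turns by treating the rotated held-object poses as the "destination" volumes to test.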

Improving Behavior By Adjusting Object Mass

Using position-matching velocities for held objects works well when the user is stationary and moving held objects around in their hands. However, when the player teleports, the velocities can become large. If held objects moving at high speed towards their final position encounter any environmental objects, the collision forces can cause unexpected chaos in the scene. Ideally, collision response code will modify impulses so that only the held objects receive separating impulses, but this may not be possible or convenient for some projects. In these cases, it can be useful to reduce the mass of held objects to tiny values while they are moving in response to player teleports.
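One way to structure the temporary mass reduction is as a scoped operation that always restores the original values, even if the teleport logic raises an error partway through. The sketch below uses plain dictionaries with a `mass` field to stand in for rigid bodies; the scale factor and field names are assumptions.

```python
from contextlib import contextmanager

@contextmanager
def reduced_mass(bodies, scale=0.01):
    """Temporarily scale held objects' mass down while they chase their
    goal pose after a player teleport, so any collisions along the way
    impart negligible impulses to the environment."""
    original = [b['mass'] for b in bodies]
    for b in bodies:
        b['mass'] *= scale
    try:
        yield
    finally:
        # Restore the original masses once the objects have settled.
        for b, m in zip(bodies, original):
            b['mass'] = m
```

In an engine, "once the objects have settled" would typically mean a fixed number of physics steps or a check that the position error has dropped back below the soft-attachment threshold.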


Stay tuned for the final post in the series, Constrained Interactions, where we discuss issues related to precise placement and interaction with distant objects.