Part 1 - Space Junkies and Virtual Embodiment
Oculus Developer Blog | Posted by The Space Junkies Team | February 28, 2019

We occasionally post insights from the Oculus Developer community so that VR pioneers can share their best practices and help drive the industry forward. Today, we're joined by the Ubisoft team behind their upcoming Oculus Rift title, Space Junkies.

In the first installment of a five-part guest series, the engineers behind the VR shooter discuss the importance of full virtual embodiment when creating VR experiences.


The first thing that usually strikes new Space Junkies players has nothing to do with jet-packs, guns or aliens (although that usually comes up just a bit later). Most often, what you hear first is something along the lines of, “Oh my gosh, I can use my hands!” sometimes followed by, “Oh, and I can see my legs, too!”

Full virtual embodiment has always been one of the main ambitions of this project as we feel VR provides a unique opportunity to deeply renew the relationship you have with your in-game character and, ultimately, the way you can express yourself in a virtual world.

With this blog article, we wanted to tell you more about this aspect of Space Junkies with the help of one of our programmers, Samuele Panzeri.

Starting From Scratch

Space Junkies has its origins in the work of some of Ubisoft Montpellier’s software engineers fiddling around with Virtual Reality. Many people on the team had worked for several years on games for 4 generations of consoles, including the Kinect and the Wii. Featuring motion controls (a first for a mainstream console), the Wii opened up a completely new range of options in terms of interaction and movements. You could play different sports games, or perform a bunch of interactions that were not possible before, or at least not as intuitive, with standard “classic” controllers.

When the team began working with VR, we were excited about its capabilities as the technology seemed even more far-reaching. Not only does VR allow for motion control, but it also allows for a fully immersive display and touch-sensitive controllers with the Rift and Vive headsets.

We believed that this hardware could accomplish more than floating hands in terms of game design. In typical developer fashion of pushing the limits, once we had the head and hands worked out, we wanted to go for arms, legs and so on for maximum expressiveness. How could we not allow players in this day and age to dab? With the appropriate engineering and design, this gen's hardware could actually achieve full embodiment, meaning a full body and its movements reproduced live in-game.

In order to fully harness and take advantage of the possibilities offered by such features, a brand-new engine dedicated to VR was built from the ground up: Brigitte, as it was cheekily called, included many breakthroughs developed to push the limits of VR, and challenged the way Inverse Kinematics (IK) was traditionally done.

Getting to Know Brigitte

Inverse Kinematics (IK) is the mathematical process of computing the positions and rotations of an articulated body's joints from the known positions of its end points, for example posing a whole arm given only where the hand is. This technique is used in many domains, from robotics to cinema to, of course, video games.

In Space Junkies, we use a combination of known and new techniques to determine the pose of the whole body. Brigitte's IK algorithm is specifically designed to reconstitute a full human body based on the 3 data inputs tracked by the hardware, which are the head and each hand for the HTC Vive and Oculus Rift. Each of these inputs is tracked with 6 Degrees of Freedom, or "DOF": three translational (surging forward and backward, swaying left and right, moving up and down) and three rotational (pitch, yaw and roll). It is crucial that your in-game avatar reacts accordingly, to make it feel comfortable, reactive and natural. That sense of embodiment is important and liberating.
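To make the inputs concrete, here is a minimal sketch of what the three tracked 6-DOF poses look like as data. The type name, field layout and sample values are our own illustration, not Brigitte's actual structures.

```python
from dataclasses import dataclass

@dataclass
class Pose6DOF:
    """A 6-DOF pose: 3 translational + 3 rotational degrees of freedom."""
    position: tuple      # (x, y, z) in metres
    orientation: tuple   # unit quaternion (w, x, y, z)

# The three inputs the hardware tracks, which the IK solver starts from
# (values are arbitrary, roughly a standing player with hands raised):
tracked = {
    "head":       Pose6DOF((0.0, 1.70, 0.0), (1.0, 0.0, 0.0, 0.0)),
    "left_hand":  Pose6DOF((-0.3, 1.20, 0.4), (1.0, 0.0, 0.0, 0.0)),
    "right_hand": Pose6DOF((0.3, 1.20, 0.4), (1.0, 0.0, 0.0, 0.0)),
}
```

Everything else about the body, from the hips to the elbows, has to be inferred from these three poses alone.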

Dear Body of Mine, Where Art Thou?

Space Junkies’ design is based on flying around in micro-gravity, which consequently removes some of the complexities imposed by full embodiment, such as walking. Still, the most challenging part is figuring out your actual body position and making it feel and behave as naturally as possible at any given time. We know where your head and hands are, but knowing where your elbows and shoulders are is a completely different story as they can move in so many different directions.

We tackle the problem of reconstructing the player's stance in the virtual world in four steps:

  1. First, we establish in which direction the player's body is facing. Using the position and orientation of the player's head and hands, as well as their previous position, we can "guess" in which direction they are most likely facing.
  2. Once we know where the chest is facing, we need to find out where to recreate the rest of the body. Specifically, we start by finding an accurate position for the character’s hips, which is highly likely to match the player’s hips.

    It is important to make an accurate estimate here: if your virtual body has drifted too far from your actual one, you will look down and be confused as your torso won't be aligned with your legs! If this happens, not only will the player's immersion be severely impacted, but it will also affect gameplay, as the player will not be able to quickly draw from their holsters.
  3. The two previous steps give us a fourth fixed point: the character’s hips. We can now try to rotate all the intermediate joints in the body so that the player’s neck and elbows reach the position we want. This is achieved with the help of an iterative solver.

    What this means is that we rotate every joint by a small amount towards its target at every step and, if we have not reached a valid pose, we keep on trying until we do. It is important to make sure joints do not end up in unnatural positions. Wrists cannot rotate 180 degrees inside the arms, and your elbows should not end up inside your torso! A pre-computed set of constraints is applied to every joint after it has been moved, getting rid of any results that the human body is unable to perform.
  4. In the fourth and final stage of the algorithm, we try to further refine the solutions found in the previous steps. Using a set of poses designed by an animator and fed into our IK solver, we adjust the player's arms into a "comfortable" position. For example, if the wrists are excessively stretched, we try to move the avatar's elbow in order to relax them. At the same time, we ensure the head and hands have reached the positions given by the headset and controllers: it does not matter if a pose looks realistic if it does not match the player's!
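The blog does not publish Brigitte's actual solver, but step 3 describes the classic shape of an iterative IK loop: nudge each joint toward its target, clamp it to human limits, repeat. Here is a minimal 2-D Cyclic Coordinate Descent (CCD) sketch of that idea; the planar simplification, function names and parameters are our own assumptions, not Ubisoft's code.

```python
import math

def ccd_solve(joint_angles, lengths, target, limits, iters=50, tol=1e-3):
    """Cyclic Coordinate Descent IK in 2D: rotate each joint a small
    amount toward the target, clamp it to its limits, and repeat until
    the end effector is close enough to the target."""
    angles = list(joint_angles)

    def forward(angles):
        # Positions of every joint plus the end effector.
        pts = [(0.0, 0.0)]
        x = y = a = 0.0
        for ang, ln in zip(angles, lengths):
            a += ang
            x += ln * math.cos(a)
            y += ln * math.sin(a)
            pts.append((x, y))
        return pts

    for _ in range(iters):
        ex, ey = forward(angles)[-1]
        if math.hypot(ex - target[0], ey - target[1]) < tol:
            break  # valid pose reached
        # Sweep from the last joint back to the root.
        for i in reversed(range(len(angles))):
            pts = forward(angles)
            jx, jy = pts[i]
            ex, ey = pts[-1]
            # Rotate joint i so the end effector swings toward the target.
            cur = math.atan2(ey - jy, ex - jx)
            want = math.atan2(target[1] - jy, target[0] - jx)
            angles[i] += want - cur
            # Per-joint limits: discard poses a human body cannot perform.
            lo, hi = limits[i]
            angles[i] = max(lo, min(hi, angles[i]))
    return angles
```

For example, a two-link "arm" of unit lengths reaching for the point (1.0, 1.0), with the elbow constrained to bend only one way (limits of 0 to pi), settles near a shoulder angle of 0 and an elbow angle of pi/2. In the real solver the same loop runs in 3D over the whole chain from hips to wrists.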

Becoming the Emoticon

Building a new engine dedicated to VR with 6 DOF allowed us to design and provide a more immersive experience. Using your equipment and weapons feels intuitive when you reach to your waist or shoulders with your actual hands to grab them. Navigation feels comfortable since you can look up, down, left or right to go anywhere you want.

The result of designing every character’s movements based on your actual movements is that the game contains no pre-recorded character animation whatsoever. It is all up to you and the crazy moves you can do.

This is supported by the controllers' capabilities. For instance, touch-sensitive triggers allow us to recreate all kinds of hand gestures in game depending on the position of your fingers (pressing the trigger, finger resting on the trigger, or finger lifted away from the trigger). This allows you to clench your fist, point in a direction, make a heart sign or a thumbs-up… and we're barely scratching the surface here.

It gives the player a great deal of self-expression so that you can do your own Victory Dance, your own taunts... You basically become a fully embodied virtual emoticon with extensive interaction. Using your actual hands to high-five or fist-bump your partner virtually brings a strong sense of presence to the game.

The Way Forward

There are already exciting new devices on their way that will push even further how you can use your hands and fingers individually and expand on the idea of player expression. Beyond that, we are always discovering new ways to detect and read the player's position and movements, allowing us to create much more sophisticated avatars and interactions.

These new methods will greatly help the movement estimation process, making it much more subtle and giving us as developers more opportunities to create natural avatars in our worlds for our players.

Space Junkies launches on March 26 on Oculus Rift, HTC Vive, Windows Mixed Reality headsets and PlayStation VR.