Developer Perspectives: Character Animation in ‘Dead and Buried’
Pärtel Lang
We occasionally post insights from the Oculus Developer community so that VR’s early pioneers can share their practices and help push the industry forward. Today, we feature Pärtel Lang, the IK Specialist on the Dead and Buried team, an independent developer dedicated to the research of character animation systems, and the founder of RootMotion, a small company based in Tartu, Estonia.
Solving the IK Problem in Multiplayer VR Experiences
The dawn of consumer VR introduces many new challenges for game design and programming, including new problems with character animation. How do we see ourselves in VR, and how do others see us? Current tracking technology is limited to just the user’s head and hands, so how do we display their bodies? Most of today’s VR games don’t attempt to render the player’s own body at all. But in multiplayer VR games, players expect to see a realistic representation of the other players—not just floating heads and hands. The challenge is to generate plausible full-body animation from the available tracking data and maintain the suspension of disbelief, immersion, and sense of presence as much as possible.
This article describes a new inverse-kinematics solver called VRIK, which was developed for the game Dead and Buried, a multiplayer first-person cover shooter for Rift and Touch. The game’s design required realistic avatars for the other players, and the following article briefly guides you through some of the main challenges and solutions we came across during the development process.
Generating Full-Body Motion From Tracking
The incoming data from Rift and two Touch controllers consists of a position (3D vector) and a rotation (quaternion) for each device. This provides very little information about the player’s full-body pose, and in most cases, there are multiple valid solutions for any given input.
Inverse kinematics (IK) is the technique typically used to generate the angles for a chain of joints given just the positions of the end effectors (i.e. the hands or head in this case), but there’s not enough data for this to work by itself.
Equally valid poses for identical controller positions
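To make the ambiguity concrete, even a planar two-link chain (think shoulder and elbow seen from the side) has two distinct joint configurations for the same end-effector target. A minimal Python sketch, with illustrative link lengths:

```python
import math

def two_link_ik(x, y, l1=0.3, l2=0.3):
    """Both joint-angle solutions (elbow 'down' and elbow 'up') for a
    planar two-link chain reaching (x, y). Link lengths are illustrative."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))  # clamp for unreachable targets
    solutions = []
    for t2 in (math.acos(c2), -math.acos(c2)):  # two mirrored elbow bends
        t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2),
                                           l1 + l2 * math.cos(t2))
        solutions.append((t1, t2))
    return solutions

def forward(t1, t2, l1=0.3, l2=0.3):
    """Forward kinematics: end-effector position for joint angles (t1, t2)."""
    return (l1 * math.cos(t1) + l2 * math.cos(t1 + t2),
            l1 * math.sin(t1) + l2 * math.sin(t1 + t2))
```

Running the forward kinematics on both solutions lands the end effector on exactly the same point, which is why effector positions alone can never pin down a full pose.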
The absence of obvious analytical solutions means that some empiricism must be employed in solving the task. The farther we look from the body parts attached to the controllers, the greater the probability of miscalculation and the risk of breaking immersion. We’re forced to make an educated guess, so we need that guess to be as intelligent as possible. The best source of knowledge and inspiration here comes from observing people actually playing the game.
The main problems are:
Finding the most relaxed pose. Based on observation, this means avoiding extreme angles and singularity points while maintaining the ability to cover most of the vast range of human motion. In multiplayer games, it’s also important to be able to convey hand/head gestures in a natural-looking way.
Achieving the look and feel of the motion. In some cases, it’s important to preserve the look and feel of the motion, matching the theme of the game and the characters, even when the player’s actual pose doesn’t comply with it.
Locomotion. With no information about the lower body pose, how do we know where to place the feet? We have to choose between an animated, procedural, or some kind of a hybrid solution for locomotion.
Maintaining balance. Since the head and hands are locked to the controllers, it’s important to place the feet and bend the spine so that the final pose appears adequately balanced.
Supporting different body proportions. Real people and virtual characters come in all shapes and sizes. How do we manage to reach when our virtual hands are smaller than our own?
Dealing with invalid input. Sometimes the controllers get obscured or the player puts one of them down and moves away. How can we prevent the avatar from twisting up in weird poses?
The Solver
Currently, no full-body IK solutions are available that meet the very specific requirements of VR content development. Besides accuracy and overall quality, it's vital for the solver to be highly efficient, since VR already places a relatively heavy burden on the CPU. On top of that, every avatar in a multiplayer game depends heavily on full-body IK rather than using it only for small cosmetic adjustments, so there is little opportunity to optimize for avatars in the distance. VR therefore requires IK to be solved at high frequency as well as high quality. Everything can be observed in close detail, and in first-person view the result even needs to stand up to comparison with reality: the player's real flesh and bones.
Considering all that, we decided to create a new solver (VRIK) that's fully dedicated to the three-controller problem: a hybrid solver combining analytic and heuristic algorithms, employing each for its specific strengths. The solver composes the final pose by solving each body part sequentially, allowing full control in and between each step and each iteration.
The following paragraphs describe the solving process in more detail.
Step 1: Spine
As noted earlier, the solver deals with each body part one by one. Since the position of the head is our primary concern, the solver holds the positional/rotational input from the HMD as its primary goal (HMD tracking is also more reliable and less likely to get occluded). It first needs to determine the bend and twist angles for the spine and the neck. Observing natural human motion provides an understanding of how much people normally bend their spines when looking up or down, left or right, or tilting their heads.
It’s not only the rotation of the HMD that provides important input here—the positions of the hands are also a valuable source of information (mostly for guessing the final angles of the chest bone). If you try moving your hands left or right while holding your head still, you’ll notice how your spine twists along with the chest following the hands. The same goes for moving one hand up and the other down. While all that can be done with the chest held still, it’s considerably more comfortable to relax the spine and let it bend, and therefore this is the behavior that we should aim for. Taking this into consideration, the HMD plays the lead role in spine calculations, but the hand controllers have an equally important duty in modifying the chest rotation around its vertical and forward axes, based on their horizontal positions on the XZ plane and heights respectively.
With and without using hand controller positions in spine calculations
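The idea of letting the hands modify the chest rotation can be sketched in a few lines. The function, hint choices, and weights below are all illustrative assumptions, not the actual rules from VRIK:

```python
import math

def chest_angles(head_yaw_deg, left_hand, right_hand,
                 yaw_weight=0.4, roll_weight=0.25):
    """Estimate chest yaw (around the vertical axis) and roll (around the
    forward axis) from hand positions. Hands are (x, y, z) tuples in the
    avatar's local space: x = right, y = up, z = forward.
    The weights are illustrative guesses, not values from the game."""
    # Twist the chest part-way toward the direction of the hands' midpoint
    mid_x = (left_hand[0] + right_hand[0]) / 2
    mid_z = (left_hand[2] + right_hand[2]) / 2
    hands_yaw = math.degrees(math.atan2(mid_x, mid_z))
    chest_yaw = head_yaw_deg + yaw_weight * (hands_yaw - head_yaw_deg)
    # Tilt the chest toward the lower hand, based on the height difference
    height_diff = right_hand[1] - left_hand[1]
    chest_roll = roll_weight * math.degrees(math.atan(height_diff))
    return chest_yaw, chest_roll
```

With both hands level and centered in front of the chest, yaw and roll stay at zero; moving both hands to one side, or one hand up and the other down, pulls the chest partway along, which is the relaxed-spine behavior described above.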
Step 2: Locomotion
Locomotion is the most complex problem for the solver to deal with. With no information about the lower body pose, it needs to be reactive, responsive, appear natural and not too robotic (unless the avatar is a robot, of course). It’s easy for the animation to end up looking like a marionette being dragged around by its head. The initial solution for Dead and Buried was a classic eight-directional strafing animation controller. Based on the horizontal distance of the head anchor from the avatar’s head, a stepping animation was played when it exceeded a certain threshold. A similar setup was used for turning, using the angle between the forward axis of the HMD and the forward axis of the avatar’s root transform. However, that solution proved to be too slow and inaccurate for a fast-paced shooting game like Dead and Buried where players would quickly jump in and out of cover. Also, the turning and stepping blend trees were conflicting, as there was no good solution for turning and moving at the same time. So we decided to go for fully procedural locomotion instead to gain maximum control over the feet.
The locomotion solver works on the principle of footsteps. The feet are planted to the footsteps and won’t budge unless a step is triggered. Calculating when and where to step and with which foot is the biggest challenge here. Real humans make side-steps in order to maintain balance, and therefore so does the solver. With the feet planted to the footsteps, the hands to the hand controllers, and the spine bending already calculated, we have a fairly good approximation of the center of mass (CoM) and the center of pressure (CoP) for the avatar. The latter lies more or less halfway from the left foot to the right, while the former is probably most quickly approximated by getting the centroid of the triangle formed by the HMD and the hand controller positions.
Center of mass, center of pressure, and the balance vector
We need to convert the loss of balance into a scalar value that can be used to trigger footsteps. That scalar can be computed very cheaply as the angle between the inverse gravity vector and the vector from the CoP to the CoM. The greater the angle, the less balanced the character appears. A footstep is triggered once the angle passes a certain threshold, and the direction of that footstep is the same balance vector orthogonally projected onto the ground plane. Similarly, an angular threshold is employed for turning on the spot. It's also important to check that the stepping foot won't cross the other foot and get the legs tangled. Procedural locomotion also gives us the chance to check for obstacles like walls and objects by ray-casting from the current footstep to the next. Checking all those conditions before making a step is important for achieving good-looking, logical, and meaningful locomotion.
In real life, people don't lean sideways and only start making a step after having hopelessly lost balance; normally they raise a foot as soon as they start moving. Unfortunately, we don't have the luxury of knowing whether an acceleration of the HMD means the player just moved their head to peek out of cover or started going somewhere, so we can't start stepping based on that information alone. We can, however, use the velocity of the headset and hand controllers as a predictive measure, decreasing the balance angle threshold based on the magnitude of that velocity to achieve a more responsive locomotion solution.
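The balance test from the last two paragraphs can be sketched as follows. The centroid approximations match the article; all numeric thresholds and the speed gain are illustrative guesses:

```python
import math

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(3))

def should_step(head, left_hand, right_hand, left_foot, right_foot,
                tracker_speed=0.0, base_threshold_deg=12.0,
                speed_gain=8.0, min_threshold_deg=4.0):
    """Trigger a footstep when the angle between straight-up and the
    CoP->CoM vector exceeds a threshold. The threshold shrinks with the
    speed of the tracked devices so locomotion reacts sooner when the
    player starts moving. All numeric constants are illustrative.
    Returns (step_needed, step_direction_on_ground)."""
    com = centroid([head, left_hand, right_hand])  # approx. center of mass
    cop = centroid([left_foot, right_foot])        # approx. center of pressure
    bx, by, bz = (com[0] - cop[0], com[1] - cop[1], com[2] - cop[2])
    # Angle between the balance vector and the up axis (0, 1, 0)
    angle = math.degrees(math.atan2(math.hypot(bx, bz), by))
    threshold = max(min_threshold_deg,
                    base_threshold_deg - speed_gain * tracker_speed)
    if angle <= threshold:
        return False, (0.0, 0.0)
    # Step direction: balance vector projected onto the ground plane
    ground_len = math.hypot(bx, bz)
    return True, (bx / ground_len, bz / ground_len)
```

A standing pose with the head roughly above the feet stays below the threshold; leaning well forward triggers a forward step, and a fast-moving headset triggers one sooner.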
Step 3: Legs
Once the solver has dealt with the spine and locomotion, it’s time to plant the feet to the footsteps. The easiest way to do leg IK is to use a simple three-joint trigonometric solver based on the Law of Cosines, which is the fastest kind of IK possible and always a perfectly accurate analytic solution. The main problem with such a solver in this case is that it can only handle three joints and will end with the ankle bone. Anchoring ankles to the footsteps isn’t the best solution for VR, as it takes away the ability for the avatar to rise up on its toes and decreases the range of motion for the headset before having to take a side-step to compensate for the static ankle. That’s why VRIK employs a dual-pass trigonometric solver (one solving the knee joint, the other the ankle) that anchors not the ankle but the toes to the footsteps.
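The analytic core of such a trigonometric solver is a single Law of Cosines evaluation. A minimal sketch of the knee-angle step, with illustrative bone lengths:

```python
import math

def knee_angle(hip, ankle_target, thigh_len=0.45, shin_len=0.42):
    """Interior knee angle (radians) that places the ankle at ankle_target,
    from the Law of Cosines. Bone lengths are illustrative; the bend-plane
    orientation is a separate step not shown here."""
    d = math.dist(hip, ankle_target)
    # Clamp: if the target is out of reach, keep the leg fully extended
    d = min(d, thigh_len + shin_len - 1e-6)
    cos_knee = (thigh_len ** 2 + shin_len ** 2 - d ** 2) / (2 * thigh_len * shin_len)
    return math.acos(max(-1.0, min(1.0, cos_knee)))
```

A hip-to-target distance equal to the combined bone lengths yields a straight leg (interior angle near 180 degrees); pulling the target closer bends the knee. VRIK runs a pass like this for the knee and another for the ankle so that the toes, not the ankle, are anchored to the footstep.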
Having solved the legs, we might find that the head is too high for the feet to reach the ground. A decision has to be made whether to let the feet float or to plant them and let the head drift instead. In my experience, it's best to plant the feet: floating feet would look worse from a third-person perspective than head inaccuracies that would hardly be noticed at all. Therefore, the solver simply moves the spine so that the hips are positioned in a way that allows both toes to reach their targets.
Step 4: Arms
Similar to leg IK, the arms can also be solved using a simple trigonometric solver. The biggest challenge here isn’t actually solving the upper arm, forearm, and hand, but the shoulders. Rotating the shoulder bones directly towards the hand target doesn’t support the wide range of motion of human arms. For instance, grabbing a weapon from the right shoulder with the left hand would cause the shoulder to flip backwards at some point. As the solver knows the position and orientation of the chest bone by now, it can use it as the local space in which to do all the calculations for the arms. VRIK uses a set of specific rules and angle offsets to deal with the shoulders, also clamping their rotation to make sure they stay within a valid range.
Since the solver has no information about the elbows, it has to try to guess the normal of the bend plane for the arms. It has three sources of information to base that guess upon: the world space position and rotation of the hand controller, plus the position of the hand controller relative to the chest bone. The hand bone can be rotated without moving the elbow at all, so again, that guess can never be conclusive, but it can provide us with a natural-looking and relaxed solution. VRIK uses an empirically-found mixture of those three information sources to compose a vector defining the bend plane for the arm, and it has proved to be able to perform most gestures in a good looking way.
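One way to picture that mixture is as a weighted blend of direction hints that is then normalized into a single bend-plane normal. The particular hints and weights below are illustrative assumptions, not VRIK's actual empirically-found rules:

```python
import math

def normalize(v):
    m = math.sqrt(sum(c * c for c in v)) or 1.0
    return tuple(c / m for c in v)

def arm_bend_normal(hand_axis, hand_dir_from_chest, chest_up,
                    weights=(0.5, 0.3, 0.2)):
    """Blend three hints into one normal for the arm's bend plane: the
    hand's own rotation axis, a sideways vector derived from the hand's
    position relative to the chest, and a default chest-up normal.
    Hints and weights are illustrative, not VRIK's actual rules."""
    d, u = hand_dir_from_chest, chest_up
    # Cross product: a sideways hint perpendicular to the chest->hand direction
    side = (d[1] * u[2] - d[2] * u[1],
            d[2] * u[0] - d[0] * u[2],
            d[0] * u[1] - d[1] * u[0])
    hints = [normalize(hand_axis), normalize(side), normalize(chest_up)]
    blended = tuple(sum(w * h[i] for w, h in zip(weights, hints))
                    for i in range(3))
    return normalize(blended)
```

Because no single hint is conclusive, the blend degrades gracefully: when the hand rotates without the elbow moving, the other two hints keep the bend plane from snapping around.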
Summary
In contrast to most full-body IK solvers, which are heuristic by nature, VRIK (the IK used in Dead and Buried) is more like a collection of analytic and heuristic solvers. That design lets us insert custom rules and procedures required by the specifics of VR development at every step, rather than being constrained by the nature and limitations of a single algorithm. In terms of performance, it will likely outperform other full-body solvers, as it has outperformed Full Body Biped IK (a component in Final-IK) by a factor of at least two to three.
VRIK is a small step towards providing more realistic VR avatars. In the future, we plan to improve the solver with additional features like internal collision avoidance, stretch and squash, eye motion, and joint limits.
About the Author
In addition to being the IK Specialist on the Dead and Buried team, Pärtel Lang is an independent developer dedicated to the research of character animation systems and the founder of RootMotion, a small company based in Tartu, Estonia.
RootMotion develops animation tools for the Unity platform. Final-IK, a complete inverse kinematics solution for Unity, has been available in the Asset Store since 2014 and PuppetMaster, an advanced character physics tool, since the end of 2015. Special thanks to Andrew Welch, Ryan Rutherford and Fabio Brasiliense for being a major help with the development and testing of VRIK!