Part 3 – Zero G Locomotion and Level Design for Space Junkies
We are excited to announce that as of today, 3/26/19, Ubisoft's zero-G multiplayer shooter, Space Junkies, is now available on the Oculus Rift. To add to this announcement, we are serving up Part 3 of our 5-part series, as the team from Ubisoft provides further insights into the creation of their latest VR title. If you haven't already, be sure to check out the first two parts of the series, Part 1: Virtual Embodiment and Part 2: Binaural Sound, and enjoy these learnings around locomotion and level design.
Designing Navigation in Space Junkies: A Brigitte Story
How do you make a fun, engaging and strategic movement system for an FPS in VR?
Designing a navigation system and a control scheme for just about any video game is a tricky endeavor. In VR, locomotion is even more complex, as you have to account for a host of additional, player-specific factors. Many of the first VR games relied on moving players via “click and teleport”. This is effective, but we wanted to give players as much freedom as possible in the worlds we create for them to explore, and teleporting always felt like a step backwards from what players are used to on traditional gaming platforms. This was therefore something we really wanted to focus on from the beginning. It was also crucial in our game, as we were creating spherical battlespaces in which players could attack from all directions.
When designing the navigation, we looked at using the HMD itself as a controller: while flying, the headset acts as the directional input, so you simply look where you want to go. It also works well in a seated position, which allows for a more fluid navigation system that lets you fly fast through these spherical spaces. On top of that, your head and hands are tracked spatially, so compared to a traditional “2D” shooter, your aim is directed by your actual hand position and not by a moving reticule. You can peek around corners by tilting and moving your head, and of course the action can now come from left, right, up, down… basically from anywhere, which creates intense and challenging battles in deadly orbital arenas!
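To make that idea concrete, here is a minimal sketch of HMD-directed flight with hand-directed aim, assuming a generic VR runtime that provides head and hand poses each frame; the types, pose values and tuning constants are illustrative placeholders, not actual Brigitte engine code.

```cpp
// Sketch of "look where you want to go" flight with aim decoupled onto the
// tracked hand. Everything here is illustrative, not engine code.
#include <cstdio>

struct Vec3 {
    float x, y, z;
    Vec3 operator+(const Vec3& o) const { return {x + o.x, y + o.y, z + o.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
};

struct Pose {
    Vec3 position;
    Vec3 forward;   // unit vector the device points along, as reported by the VR runtime
};

struct PlayerState {
    Vec3 position{0.0f, 0.0f, 0.0f};
    Vec3 velocity{0.0f, 0.0f, 0.0f};
};

// Thrust is applied along the headset's forward vector: the HMD is the
// directional input for flying.
void updateFlight(PlayerState& p, const Pose& head, float thrustInput, float dt) {
    const float acceleration = 12.0f;   // made-up tuning value
    p.velocity = p.velocity + head.forward * (thrustInput * acceleration * dt);
    p.position = p.position + p.velocity * dt;
}

// Aim is independent of movement: shots go where the tracked hand points.
Vec3 aimDirection(const Pose& hand) {
    return hand.forward;
}

int main() {
    PlayerState player;
    Pose head{{0.0f, 1.7f, 0.0f}, {0.0f, 0.0f, 1.0f}};          // looking "into" the arena
    Pose hand{{0.3f, 1.4f, 0.2f}, {0.7071f, 0.0f, 0.7071f}};    // aiming off to the side

    for (int frame = 0; frame < 90; ++frame)                    // one second at 90 Hz
        updateFlight(player, head, 1.0f, 1.0f / 90.0f);

    Vec3 aim = aimDirection(hand);
    std::printf("flew to z=%.2f while aiming (%.2f, %.2f, %.2f)\n",
                player.position.z, aim.x, aim.y, aim.z);
    return 0;
}
```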
Unlocking a New World of Potential
This work started with a lot of prototypes: we implemented and tested a wide variety of control schemes, from traditional non-VR games to more recent VR experiences, including classic FPS running, climbing, grappling and swimming! Some were a bit weird, some were fun but felt outdated and some were just too tiring. We were constantly trying to remove friction for the player so they could jump in easily and start having fun flying around, especially vertically!
When starting on any new medium, you are trying to find the right angle, and you often get the best inspiration from real life. During the early steps in our navigation work, we had the opportunity to meet with specialists and astronauts from the European Space Agency (ESA) who talked us through what it’s like to move around in space. At first, this series of meetings inspired us to follow a hyper-realistic approach using zero-G-based movements (meaning a great deal of inertia). Pure zero-G allows you to move around with minimal effort and sounds greatly liberating in theory, but in our prototypes it turned out to be quite uncomfortable to experience, even virtually. When two astronauts, one of whom was a jet test pilot and regular spacewalker, tell you that moving in such an environment can lead to nausea over a sustained period of time even for hardened astronauts, you start to think that pure simulation is not such a great idea! It was interesting to hear some of the tricks used on the ISS (International Space Station) to help the brain know which way is up. Since there isn’t really an “upside down” in space, they use lights to help give visitors a sort of anchor for up and down.
Having a full zero-G-based navigation system would also imply that every push, every trajectory and every rotation is a pure commitment. We did try many things, but it was very difficult in terms of spatial awareness, as it gets really complicated and disorientating. It’s also very unforgiving for players who can’t correct and adjust based on how a battle is going. We wanted to make something fast and a bit twitchy, like a sports car or jet-powered skateboard that would be fun for a competitive shooter.
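To illustrate that trade-off, here is a toy comparison of a pure zero-G integrator, where every push is a full commitment, against a damped model that lets you bleed off speed and correct your trajectory mid-fight; the constants and the simple drag formula are invented for the example and are not actual game tuning.

```cpp
// Toy comparison of pure zero-G movement versus a damped, arcade-style model.
// Constants are invented for the example; this is not actual game tuning.
#include <cstdio>

struct Body {
    float velocity = 0.0f;   // one axis is enough to show the difference
};

// Pure zero-G: every impulse is a full commitment, velocity never decays.
void stepNewtonian(Body& b, float thrust, float dt) {
    b.velocity += thrust * dt;
}

// Damped model: speed bleeds off once the player stops thrusting, so
// trajectories can be corrected mid-fight.
void stepDamped(Body& b, float thrust, float dt, float damping = 3.0f) {
    b.velocity += thrust * dt;
    b.velocity /= (1.0f + damping * dt);
}

int main() {
    Body pure, damped;
    const float dt = 1.0f / 90.0f;           // typical VR frame time

    // Thrust for half a second, then coast for two seconds.
    for (int i = 0; i < 45 + 180; ++i) {
        float thrust = (i < 45) ? 10.0f : 0.0f;
        stepNewtonian(pure, thrust, dt);
        stepDamped(damped, thrust, dt);
    }
    // The pure body keeps drifting at full speed; the damped one settles quickly.
    std::printf("pure zero-G velocity: %.2f, damped velocity: %.4f\n",
                pure.velocity, damped.velocity);
    return 0;
}
```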
We actually went out and tried different types of aerial experiences, including water-powered jetpacks (also known as flyboards) and sky-diving simulators. Those were great learning experiences that gave us a better understanding of body position and visual communication, since you can’t hear anything in a wind tunnel.
Sébastien, Game Director, experiencing simulated skydiving
I guess the biggest inspiration actually came from drones. We tried different models in various types of situations and were greatly impressed with some of their qualities. Their maneuverability in an open spherical space was something we really wanted to capture in our navigation system. The drone control scheme served as a useful starting point for our navigation system and for how to take advantage of that “6 degrees of freedom” that VR provides. That freedom of dodging, ducking and weaving turned out to be really responsive in VR, and the lack of motion sickness allowed us to push the fast-paced shooting experience.
Being able to look around naturally in the virtual space is wonderful, and you truly get a sense of scale and spatial feedback that even the biggest movie theatre screen can’t give you. It is just so immersive. We had some funny early sessions with players who had never tried VR, and we kept telling them over and over to look up, but they just wouldn’t move their head. We would tilt their heads back gently and they would gasp, “Woooow, no way!” Essentially, we were re-teaching gamers how to interact in a virtual world. The spatialized audio (explained in part 2 of this blog series) was also a real breakthrough on that front, as it helped to enhance the situational awareness for the player. Whenever you hear something, no matter where the source is located (above you, behind you, etc.), you tend to look toward it quite naturally because your ears are telling you where it is, so we are able to use this with jet packs and various special effects to help players know where they are within the environment.
We also went for a first-person perspective for the obvious benefits in terms of immersion. It’s all about that connection between you, your hands and your body – you really feel like you are a part of this virtual world, and that sense of presence is awesome. It brings a unique potential in terms of player expression and the sense of social presence as well, which is incredibly strong in VR. You essentially become this living emoticon. It’s amazing how even small things like shaking or nodding your head changes that social interaction between players in a virtual space.
Tackling The Tough Question
Designing for player comfort also played a huge role, as some people need to adapt to the new VR environment and learn how to play “differently”. In VR you are in the environment and it is all about you, but the things you would do in a traditional game do not have the same impact (for example, looking at your feet and rotating in a circle). In VR, the effects are comparable to when you look at your feet and turn around in real life: you just might get dizzy! In early testing, players also tried to find holes in the décor so they could sneak around or go to a different room, so they would push into a wall’s collision over and over and over again… That’s fine in a 2D game, but in VR it’s like you’re bashing your head continually against a wall. We are constantly balancing the freedom we give to a player with the level of comfort, as some people are simply more sensitive than others, but players will always try to do things developers never imagined.
Comfort is a focus and a challenge in VR, just as it was in the early days of FPS games like Doom or Wolfenstein. I remember people falling off their chairs and being disorientated at first. VR is no different: we are trying to “re-teach” players how to play a shooter in VR. We have to find ways to help them make that adjustment, all without long tutorials and explanations, especially since we don’t want them to miss out on that magic moment of flying through space with a jetpack strapped to their back!
Our main tactic was to iterate and test, test, and test again with a wide variety of players. A stable frame rate is key in VR to ensure that the image remains fluid for the player. The same goes for loading times: when you are in a headset, you want the game to start quickly and smoothly, so those two factors were key. We then added a series of “softeners” that can also help with comfort levels. You may, for example, notice our blinder system when playing, which basically creates black shutters on the sides, like horse blinders, when you are flying through tight spaces; this helps a great deal with vection. “Vection” is the sensation of movement of the body in space produced purely by visual stimulation, similar to the impression of locomotion when watching a moving train through the window of a stationary train. It also helps with rotations, which I will touch on in a moment.
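As a rough sketch of how a softener like that can be driven (the actual implementation and thresholds in Space Junkies are not shown here, so every value below is an assumption), a speed-based vignette might look like this:

```cpp
// Illustrative speed-driven comfort vignette ("blinders"). Thresholds, fade
// speed and the mapping from speed to vignette strength are assumptions.
#include <algorithm>
#include <cstdio>

// 0 below the comfort threshold, ramping linearly to 1 at high speed.
float targetVignette(float speed, float minSpeed = 4.0f, float maxSpeed = 12.0f) {
    if (speed <= minSpeed) return 0.0f;
    return std::min(1.0f, (speed - minSpeed) / (maxSpeed - minSpeed));
}

// Fade toward the target so the blinders ease in and out instead of popping.
float fadeVignette(float current, float target, float dt, float fadeSpeed = 5.0f) {
    float step = fadeSpeed * dt;
    if (target > current) return std::min(target, current + step);
    return std::max(target, current - step);
}

int main() {
    float vignette = 0.0f;
    const float dt = 1.0f / 90.0f;
    // Accelerate through a tight corridor: speed climbs from 2 to 14 m/s.
    for (int i = 0; i <= 90; ++i) {
        float speed = 2.0f + 12.0f * (i / 90.0f);
        vignette = fadeVignette(vignette, targetVignette(speed), dt);
    }
    std::printf("vignette strength after the sprint: %.2f\n", vignette);
    return 0;
}
```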
We also added other anchoring elements, such as the helmet, and the fact that we have full embodiment, not just floating hands. These helped the more sensitive players to feel more “present” in the world.
Another big factor is rotation. Again, situational awareness is key in VR, and especially in a shooter, so we introduced a snap rotation. It feels a bit strange at first, as players are used to smooth rotation, but its purpose is to adjust your trajectory, not to rotate like in a classic FPS. It snaps in 45-degree increments, and the world is laid out around those angles too, so once you get used to navigating with your head combined with this feature, it lets you duck and weave quickly and vertically. It takes a few test runs before it clicks, but it really does get you moving fast and comfortably through the environments.
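Here is a bare-bones sketch of how a 45-degree snap turn can be wired to a thumbstick flick; the deadzone value and re-arming logic are illustrative assumptions, with only the 45-degree angle taken from the paragraph above.

```cpp
// Sketch of 45-degree snap rotation around the world's up axis.
// One snap fires per flick of the stick; the stick must return to
// center before another snap can trigger.
#include <cstdio>

struct SnapTurner {
    float yawDegrees = 0.0f;      // player orientation around world up
    bool  stickWasNeutral = true;

    // stickX in [-1, 1].
    void update(float stickX) {
        const float deadzone  = 0.6f;    // assumed value
        const float snapAngle = 45.0f;   // from the article

        if (stickWasNeutral && stickX > deadzone) {
            yawDegrees += snapAngle;
            stickWasNeutral = false;
        } else if (stickWasNeutral && stickX < -deadzone) {
            yawDegrees -= snapAngle;
            stickWasNeutral = false;
        } else if (stickX > -deadzone && stickX < deadzone) {
            stickWasNeutral = true;      // stick returned to center, re-arm
        }
    }
};

int main() {
    SnapTurner turner;
    float samples[] = {0.0f, 0.9f, 0.9f, 0.0f, -0.9f, 0.0f, 0.9f};
    for (float s : samples) turner.update(s);
    std::printf("yaw after input sequence: %.0f degrees\n", turner.yawDegrees);  // 45
    return 0;
}
```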
A World Built Around Flying
To make the most out of flying, the player is put in a spherical space, which, compared to traditional games, means they can navigate in all directions.
To allow for a comfortable experience using that design, the player needs to have clear orientation and can’t be upside down. Much like in the ISS, we ensure that what’s up and down is clearly indicated for player orientation.
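One simple way to enforce that “never upside down” rule on the controller-driven body orientation (the tracked head itself stays free, of course) is to store it as yaw and pitch only and clamp the pitch; this sketch and its clamp value are purely illustrative and not necessarily how Space Junkies handles it.

```cpp
// Sketch of keeping a consistent "up" in a spherical arena by suppressing
// roll entirely and clamping pitch. The clamp value is illustrative.
#include <algorithm>
#include <cstdio>

struct Orientation {
    float yawDeg   = 0.0f;   // rotation around world up
    float pitchDeg = 0.0f;   // look up/down; roll is intentionally absent
};

// Apply look input while guaranteeing the horizon never flips over.
void applyLook(Orientation& o, float yawDelta, float pitchDelta) {
    const float maxPitch = 85.0f;   // keep a margin from straight up/down
    o.yawDeg  += yawDelta;
    o.pitchDeg = std::clamp(o.pitchDeg + pitchDelta, -maxPitch, maxPitch);
}

int main() {
    Orientation o;
    applyLook(o, 30.0f, 120.0f);    // player yanks the view way past vertical
    std::printf("yaw=%.0f pitch=%.0f (pitch clamped, never upside down)\n",
                o.yawDeg, o.pitchDeg);
    return 0;
}
```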
The player does not fly in an empty, objectless space, but instead navigates around asteroids, dives down narrow corridors, rushes to cover... Level design thus plays a huge role in how we approach navigation in Space Junkies. We think multiplayer maps should always give players options in how they engage in a fight, chase an enemy or fly away from a situation. Players make their choice based on their assessment of the situation, and this creates a bit of a “mind game” around what their target is about to do next.
In traditional shooters you will pursue an enemy, and you are presented with the choice of following them as closely as possible (at the risk of being exposed and caught vulnerable), or taking a safer but longer route from cover to cover (at the risk of losing your target). You can even anticipate their course and wait to ambush the target where you think they will go next. Most of this happens in a room or on a horizontal plane. In VR, that enemy player can go vertical at any time and you could end up flying right past him, since we’re dealing with a spherical space. So, as a predator, you have to keep your eye on the prize and look out for those jet pack trails!
In terms of development, having a true spherical battlefield has been a rewarding design challenge. Flanking your enemy from the side and surprising him has always felt fun and gratifying, as you outsmart him by not coming from where he expected you. Now how about doing that from above? Or from below? Being able to fly around the map adds many possibilities. Making the best out of verticality makes all the difference here. This was one of the key intentions of our level design: making our navigation system shine by multiplying the options and routes a player can take, while making him think about the best course of action depending on the situation he’s in.
We struggled for a time to achieve this with traditional map editors: designing spherical maps on a 2D screen makes it hard to get that sense of orientation and scale. It was Brigitte (our game engine powering Space Junkies) and our proprietary map editor that allowed us to design maps directly in VR. We were able to place, rescale, copy and delete objects and scenery on the fly and experience them in VR, in real time, for fast iteration.
When developing Space Junkies, it’s all about iteration, and we are constantly trying to find new ways to help players discover VR, navigate through our worlds and, most importantly, have some good ole fashioned fun!
From The Oculus Team:
We're excited to see what Ubisoft will share in the last two parts of this series, especially now that Space Junkies is available. In the meantime, feel free to check out the first two parts of the series, linked above.