Guest Blog: Bringing a Trailer to Life for VR with Defense Grid 2
Oculus Developer Blog
Posted by Patrick Moynihan, Studio Art Director, Hidden Path Entertainment
March 9, 2016

Guest Blogs are written by third-party developers with real-world expertise creating virtual reality experiences. They are a great way to dive into the thoughts and opinions of folks on the bleeding edge of VR design and software engineering.

As Defense Grid 2: Enhanced VR Edition raced towards completion, we found ourselves faced with a challenge: how do we create a great trailer for a VR game?

In the past, we’ve been able to capture gameplay footage and use our in-game camera spline tools to produce trailers with a mix of dramatic camera moves and real in-game footage.

We knew we needed to capture the feel of playing the game in VR, with the player looking around and moving their head. This all looks completely fine through the wide-angle optics of the Rift headset, but when converted to a narrower field of view on a 16×9 display, the captures looked jerky, uneven, and difficult to watch. We ended up with footage that simply didn’t do a good job of selling how the game really feels when you’re playing it in VR.

Smoothing the camera

We needed to find a way to smooth out the movement of the headset while still being able to play the game, which relies on a gaze-based targeting mechanism to build and upgrade towers. Our first thought was to mount the headset to a tripod with a fluid video head. To do this we used zip ties to attach the headset to a base plate (called a cheese plate) which could then be mounted to a tripod.

Cheese plate used for mounting headset

Headset zip-tied to cheese plate

By placing the headset on a tripod, the Rift sensor tracking the headset would direct the game to move the in-game virtual camera based on the headset’s location and angle. We then displayed the player’s view not only on the eye screens inside the headset, but also on an output monitor. For the purposes of capturing footage, we output a 1080p 16×9 image from the center section of the left eye to the main PC screen and then captured that footage directly for putting together our trailer.
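The geometry of that center-section capture can be sketched out: take the largest centered 16:9 rectangle that fits inside the (taller-than-wide) eye buffer, then scale it to 1920×1080. This is a rough illustration, not Hidden Path's actual capture code, and the 1344×1600 render-target size in the example is an assumed placeholder, not a figure from the original post.

```python
def center_crop_16x9(width, height):
    """Return (x, y, w, h) of the largest centered 16:9 rectangle
    that fits inside a width x height eye buffer."""
    target = 16 / 9
    if width / height > target:
        # Buffer is wider than 16:9: keep full height, trim width.
        h = height
        w = int(round(h * target))
    else:
        # Buffer is taller than 16:9 (typical for a per-eye buffer):
        # keep full width, trim height.
        w = width
        h = int(round(w / target))
    return ((width - w) // 2, (height - h) // 2, w, h)

# Example with a hypothetical 1344x1600 left-eye render target:
print(center_crop_16x9(1344, 1600))  # -> (0, 422, 1344, 756)
```

The resulting 1344×756 region would then be scaled up to 1920×1080 for the recording.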

The tripod mounting gave us great stability and smooth rotational movement, but the fixed position of the tripod made it impossible to demonstrate positional tracking, which is an important part of the game experience that we really wanted to highlight.

Headset mounted on fluid head and tripod

Our next experiment was to mount the fluid head on a camera slider, which would give us movement on one axis plus rotation. We were able to get some decent shots with this setup, but it still felt very limiting to only be able to move the camera’s position along one axis.

Headset mounted on camera slider

In an attempt to gain complete freedom of movement, we decided to try a “Fig Rig” – mounting the headset on the base plate between two handles which are spaced out about a foot apart.

Because the operator’s hands are spaced out away from the headset, it’s easier to produce smooth rotational movements that look like they could be produced by someone wearing the headset. This configuration produced good results but it was still unmistakably handheld. We wanted something that felt a little bit more polished, so we pushed on.

The Fig Rig worked well for some shots

There are some great 3-axis stabilizers out there like the DJI Ronin. We looked into renting one from a local rental shop, but these gimbals are all designed for cameras that aren’t as wide as the Rift headset. We weren’t confident that we could come up with a way to suitably mount and balance the headset in the confined space of the mounting platform.

This led us to look at the Steadicam. Since these units don’t have complex 3-axis motor cages around the central balance point, it would be easy to attach the headset and fly it on the Steadicam. After renting one and spending a day with it, we discovered a fatal flaw in our plan. A Steadicam works by keeping the camera in near-perfect balance around the gimbal, and the cord hanging off the headset introduced a significant disturbance into this delicate equilibrium. No matter how hard we tried to minimize the influence of the cable, we were unable to achieve a stable system.

While the Steadicam experiment was a failure, it did lead us to our final solution. At this point we knew we wanted some kind of arm that would allow us to add mass to the headset and therefore make it easier to produce steady movement. Enter the jib:

Jib arm with counterweights for balance and additional mass

Unsurprisingly, one of the things we found with the jib was that “standard” Hollywood crane shots came most naturally. We were able to capture a ton of great-looking swoops and pans (you’ll see several of these shots in the trailer), but they didn’t really convey that we were looking through the player’s eyes.

Honing the technique

What we later realized really helped sell the concept of “head movement” to the viewer was mimicking the idea of “a decision being made in real time, with the player changing their mind and looking over in another direction.”

So you’ll also see several shots (and we even went back and recorded more of them to really cement it in the trailer) where we are looking one direction, and then we swing to a different direction as smoothly as possible. This really sells the idea that we are “looking through the eyes of a person who is making decisions on the fly about what to look at”, and reinforces the idea that we’re showing them footage through the headset.

Another type of shot that really sold that we were “looking through the player’s eyes” was the UI shot, where we built towers mid-shot, bringing up the in-game UI and using it while moving through the space. Again, this isn’t typical in a game trailer, but it helped sell that this was “real game footage by a player,” which can be difficult to convey. Being able to play the game with the remote taped to the control handle made this possible for a single operator.

Remote taped to handle for single-person operation

Takeaways

A few things to keep in mind:

  1. The Kessler Pocket Jib Traveler we used is very lightweight. It’s easy to introduce oscillations if you aren’t careful; a heavier jib would make this less of a factor. We used one 5 lb and one 2.5 lb counterweight plate (from the Sears fitness department) to get the jib balanced with the Oculus headset and fluid head.
  2. It’s hard to get good results by just grabbing the handles and pointing. You have to guide the whole thing with your fingertips and take advantage of the counterweight mass and momentum. This takes practice.
  3. You need a good fluid head mounted on the end of the jib. The Oculus headset is very light compared to a camera so a head with adjustable/defeatable spring tension and drag is key. We used a Manfrotto MH055M8-Q5.
  4. You can do a little bit of stabilization in post, though you have to be careful if there is a lot of deep perspective in your shot – it will smooth out the motion, but it can introduce some strange perspective artifacts. It also resamples your footage, so you may want to consider capturing at greater than 1080p.
  5. Capturing at 60 fps makes a big difference, and feels much more like the experience of playing the game in the headset.
  6. If you don’t have access to a jib, cobbling together a Fig Rig is a pretty decent alternative and in some ways is more flexible.
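Takeaway 4’s advice to capture above 1080p can be made concrete: if you assume the stabilizer will crop away some fraction of each frame dimension, you can back out the minimum capture resolution that still delivers native 1080p without upscaling. This is a back-of-the-envelope sketch; the 10% crop figure is an illustrative assumption, not a number from our workflow.

```python
import math

def min_capture_size(crop_fraction, out_w=1920, out_h=1080):
    """Minimum capture resolution such that, after a stabilizer crops
    away `crop_fraction` of each dimension, the remaining frame still
    covers the output size without upscaling."""
    keep = 1.0 - crop_fraction
    return (math.ceil(out_w / keep), math.ceil(out_h / keep))

# If the stabilizer eats roughly 10% of the frame:
print(min_capture_size(0.10))  # -> (2134, 1200)
```

In practice that means even a modest stabilization pass pushes you past 1080p capture, which is why shooting at a higher resolution pays off.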

Final Thoughts

In the end, we used the jib for about 80 percent of the footage in the trailer. It provided a nice stable platform, but was flexible enough to allow us to move and aim the camera in a natural way. The jib was key to getting the shots we wanted, but it isn’t a panacea. And of course you are limited to the arc of the arm unless you put the whole thing on a dolly. We haven’t tried that yet – maybe next time!