Oculus Launch Pad Grad Justin Palmer Shares the Creative Process Behind Mend

Oculus Developer Blog
Posted by Oculus VR
February 18, 2021
Launch Pad

Each year, Oculus Launch Pad supports promising VR content creators from diverse backgrounds with hands-on training and support, so they can iterate on their unique ideas and bring them to market.

2019 Launch Pad grant recipient and Digital Precept Creative Director Justin Palmer spoke with us about his involvement with Oculus Launch Pad and how it helped shape his career and the development of Mend, an asymmetric co-op puzzle platformer.

If you’re interested in applying for our next Oculus Launch Pad session, watch for our open application announcement on the blog this spring.

Congrats on receiving an Oculus Launch Pad grant! What was your Launch Pad experience like and how has your involvement made an impact on your career?

Thanks! Launch Pad was definitely a wild ride. I applied to the program at the recommendation of one of my friends, Kathryn Hicks, a super talented artist and fellow Launch Pad Alum. I didn’t think I’d make the cut, so I was ecstatic when I saw the acceptance email show up in my inbox. But yeah, Launch Pad was amazing. Networking at the boot camp was a lot of fun, all the panels there were super educational, and then attending Oculus Connect 6 was the cherry on top. The whole week had this creative energy to it that I haven’t really felt anywhere else.

I’m mostly a self-taught developer. There’s always been this nagging feeling in my head about whether or not I’m headed in the right direction or doing something the right way, so I think the biggest impact Launch Pad made was to validate my skills to myself. Not only that, but these days I find myself wearing the Creative Director hat over at Digital Precept, an XR studio I run with my friends and business partners Jonathan Schrack and Craig Herndon. Like I said, wild ride!

What are your top tips for devs hoping to be more inclusive and reach a broader audience?

Accessibility is huge. I think there’s a common misconception that an accessible game is an easy game, but I would argue that an accessible game gets out of the player’s way. Beat Saber does a really good job with this. There are several different gameplay options, visual effects can be tweaked or turned off completely, and the input for the core mechanic is intuitive (swing a laser sword). Likewise for Mend, the VR player only really needs one button, the grip trigger, to play the game. There’s also an inventory system, but it’s bound to a hold-your-hand-out gesture instead of a button.

I was at PAX South in January last year, and I still had a couple weeks before I needed to turn in Mend’s vertical slice. PAX usually opens with a “Story Time with...” talk from someone in the industry, and 2020 opened up with Rod Fergusson. A lot of his talk was dedicated to inclusivity and accessibility in game design, and it was definitely an eye-opener for me. I took a ton of notes and I’m hoping I can apply those lessons to Mend.

Did you run into any major technical challenges? If so, how did you overcome those challenges?

Absolutely. The audio engine that Mend uses required a bunch of customizations to fit the game’s needs. I had never worked with an audio engine, and normally I don’t mind rolling my sleeves up and tinkering with unfamiliar technology, but I already had my hands full building Mend’s vertical slice. Thankfully, Jonathan was able to jump in and fully implement a solution.

What he managed to pull off is really cool. The game sends independent spatialized audio streams to each player, to an audio driver of their choice, at the same time, from the same device. In other words, the VR player will only hear the SFX intended for them. The same goes for the flatscreen player. Neither player experiences audio bleed-over!

The way this works normally is that the audio engine creates one instance of its RuntimeManager class. As the name implies, this class drives all the audio functionality for the game. Any audio listeners and emitters in a scene only need to worry about that one Manager. This is where we throw a wrench into things. For Mend, the game instead creates two instances of the RuntimeManager class: one dedicated to the VR player, and another for the flatscreen player. Because the game now has two Managers, the sound emitters and listeners also need to know which instance to work with. This can all be specified from inside the Unity editor. It’s a vast oversimplification, but the point is that it was definitely worth all the extra work.
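The dual-manager idea described above can be sketched roughly like this. To be clear, this is a conceptual illustration only, not the engine’s actual API: `AudioRuntimeManager`, `PlayerRole`, `PlayEvent`, and the driver indices are all hypothetical stand-ins for the real customized classes.

```csharp
// Conceptual sketch — hypothetical names, not the real audio engine API.
public enum PlayerRole { VR, Flatscreen }

public class AudioRuntimeManager
{
    public PlayerRole Role { get; }
    public int OutputDriverIndex { get; }

    public AudioRuntimeManager(PlayerRole role, int outputDriverIndex)
    {
        Role = role;
        OutputDriverIndex = outputDriverIndex;
        // In practice: initialize a separate low-level audio system here
        // and point its output at the chosen audio driver/device.
    }

    // Emitters tagged with this manager route their events here, so a
    // VR-only sound never reaches the flatscreen player's device.
    public void PlayEvent(string eventPath)
    {
        // Spatialize and mix the event on this manager's system only.
    }
}

public static class AudioBootstrap
{
    public static AudioRuntimeManager VrAudio;
    public static AudioRuntimeManager FlatAudio;

    // At startup, create one manager per player instead of a singleton.
    // Listeners and emitters in the scene then pick which manager they
    // belong to (in Mend's case, via a field exposed in the Unity editor).
    public static void Initialize()
    {
        VrAudio   = new AudioRuntimeManager(PlayerRole.VR, outputDriverIndex: 0);
        FlatAudio = new AudioRuntimeManager(PlayerRole.Flatscreen, outputDriverIndex: 1);
    }
}
```

The key design choice is that each manager owns its own audio system bound to its own output device, so the two mixes never share a bus and bleed-over is impossible by construction.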

What were the biggest design challenges for building an asymmetrical game?

Building gameplay spaces that are fun and engaging for both players. The characters in Mend have vastly different perspectives, so I always have to be cognizant of their scale as I’m blocking a level out. Sometimes I greybox a space, and it feels good in VR, but then I run around as the flatscreen player and everything feels much too big and empty. I also need to make sure the VR player has enough level geometry to grapple with. The locomotion system in Mend is kind of like rock climbing: the VR player grapples level geometry and pulls themselves around. If there’s too much to hold on to, the space can start to feel cramped or cluttered. With too little grabbable geometry, locomotion becomes a chore. There’s definitely a goldilocks zone I need to work in, but I believe Mend has struck that balance.

What did you learn from your experience playtesting the game?

I want to say that I learned how to observe playtesters more effectively. I had my playtesters record their gameplay sessions, which let me analyze and replay specific moments during a playtest (as well as keep everyone safe during the pandemic). This proved to be invaluable. I could really pay attention to how the players were working together, how they went about problem solving and communicating with each other, or how well they learned a new mechanic the game was presenting to them. If they were running into problems somewhere, why? Did I leave enough signifiers for them to intuit what needs to be done? Enough constraints to know that an approach won’t work? Is the game giving them enough feedback when they correctly figure something out? Is the level flow confusing? Are the puzzles making sense? Oh yeah, and are the players even having fun? There’s a ton of feedback playtesters can provide, both explicitly and implicitly.

Footage from an early playtest

Can you discuss what your main source of inspiration was for the game?

So this is a funny story.

I didn’t have a specific idea of something to make when I was accepted into Launch Pad. I just knew I wanted to do something with VR. I’m a bit of a game jam junkie, so I figured that I could let my intuition drive my creative process (i.e., wing it). I got home and immediately prototyped a handful of different ideas over the course of a few weeks. I didn’t like how anything was shaping up, and at this point I was starting to feel really anxious because I had just spent the first few weeks of a three-month-long development cycle building dead ends. I decided it would be best to step away from ideating for a couple days.

Fast forward those couple days, and I was at a happy hour with some friends from work when the dreaded question came up:

“Hey, Justin, how’s your game coming along?”

“Uh… not well.”

After explaining the situation and admitting I had run into a wall, everyone started pitching ideas, or games and cartoons they loved growing up and how they could be adapted to VR. This is where we struck gold.

“Do you remember those old cartoons where the artist would draw a character, it would spring to life, and then the two would fight it out? That has a cool creator-vs-creation vibe to it!”

That stuck. When I got home, I immediately started to brain dump every idea into a notebook. Initially, the game was a competitive player-vs-player affair, still with a smaller flatscreen player and a larger VR player, but it slowly shifted to a more cooperative experience as I kept working. After a few pages, I kitbashed some stuff together in Unity and immediately fell in love with the prototype. So, yeah, long story short, my inspiration came from my friends… and maybe a pint or two.

What influenced the overall look and feel of the game?

I knew that I wanted something bright, colorful, and stylized. I ended up referencing other games that had the kind of look I was after. I remember looking at The Legend of Zelda: The Wind Waker, Team Fortress 2, Job Simulator, Budget Cuts… I think I might be missing a few, but those are the stand-outs.

What advice would you give to a developer looking to start building for VR?

Honestly, just jump into it and start tinkering with things when you can. There’s a good amount of documentation out there, and plenty of communities to join these days. What helped me a ton was finding a local Unity meetup and participating in game jams.

Is there anything else you’d like to share with our developer audience?

I want to playtest your games! Oh, and feel free to stop by and say hi on Mend’s Discord!