We occasionally post insights from the Oculus developer community so that VR’s early pioneers can share their practices and help push the industry forward. Today, we feature Owlchemy Labs’ self-proclaimed Founder, CEO, and Janitor, Alex Schwartz, to go behind the scenes of Job Simulator.
Tell us a little bit about yourself and why you began developing Job Simulator for Rift and Touch.
Alex Schwartz: I’ve always been interested in the concept of VR, but it only became something that was within reach with the announcement of the Oculus Rift Kickstarter. Once Devin Reimer (co-founder and CTO, Owlchemy Labs) and I got our hands on the Oculus DK1, we immediately dropped everything and ported the base-jumping title AaaaaAAaaaAAAaaAAAAaAAAAA!!! to VR, creating Aaaaaculus! When it came to creating Job Simulator, we knew that hand-tracked controls would be crucial to creating compelling VR experiences, so coming to Touch was a no-brainer. Since our first taste of 6DOF hand controllers, Owlchemy has been entirely committed to making sure that we’re available on every platform that supports them!
How did you adapt your thinking for VR?
AS: Our thought process for VR involved throwing absolutely everything out the window and starting from scratch. VR is an entirely new medium with new methods of interaction and new ways in which it affects us. Nothing feels quite like VR with tracked hands, and it’s essential to build for the platform and design something that’s unique to VR. To try and make something without taking that into consideration is really doing both the platform and any created experience a disservice. So instead, we jumped into VR with as few expectations as possible and a sense of discovery in mind. By doing so, we were able to explore and play until we found fun. Through prototyping and by letting the medium dictate what was possible, we were able to arrive at a brand new genre of game: a “job simulator,” if you will.
What was the most exciting part of developing Job Simulator for VR?
AS: We’re most excited about VR’s accessibility! With tracked hands, navigating a game becomes as easy and natural as reaching out and grabbing something in the real world. By reducing interaction abstractions—no more pressing B to crouch, or A to jump—we found that everyone, regardless of their previous experience, had the ability to easily and intuitively play.
Can you talk a bit about your development practices related to the multiple level layouts for room-scale/360 with two-sensor setups?
AS: The most important objective for us was making sure Job Simulator would “just work” for all players in all situations and room sizes. To that end, we specifically wanted to give people as much play area as they had available in the real world, without forcing them to navigate menus, run through manual setups, or answer a myriad of questions. Scaling the room up or down in VR wouldn’t give the right feeling, so we chose “hard mode”—to manually build a number of differently sized rooms, lay out each by hand, and build a system to automatically detect which room to use for your particular setup. It was a lot of work, but the smooth final experience for the players makes it 100% worth it.
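The selection logic Alex describes—picking a hand-authored layout that fits the player’s tracked space—can be sketched roughly as follows. This is an illustrative Python sketch, not Owlchemy’s actual code; the layout names and dimensions are assumptions for the example.

```python
# Hypothetical sketch: given the player's calibrated play-area bounds,
# pick the largest hand-built room layout that still fits.
# Layout names and sizes below are illustrative assumptions.

# (width, depth) in meters required by each layout, smallest first.
LAYOUTS = [
    ("standing_360", (1.0, 1.0)),
    ("small_roomscale", (1.5, 2.0)),
    ("large_roomscale", (2.5, 2.5)),
]

def pick_layout(play_width, play_depth):
    """Return the name of the biggest layout that fits the tracked area."""
    best = LAYOUTS[0][0]  # always fall back to the smallest layout
    for name, (w, d) in LAYOUTS:
        # accept the layout in either orientation of the play space
        if (w <= play_width and d <= play_depth) or \
           (d <= play_width and w <= play_depth):
            best = name
    return best
```

Because the list is ordered from smallest to largest footprint, the last layout that fits wins, and a player with a tiny space still gets the standing fallback with no menus or questions.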
What challenges arose from developing for VR that you didn’t expect?
AS: To make objects and tools in Job Simulator, we ended up having to rethink a lot of everyday designs to make them easy and intuitive for VR interaction. An example of this can be found in the Auto Mechanic job. When we initially implemented tires, we had a series of steps that had to be completed that were modeled after the actual process of replacing a tire—lug nuts had to be tightened or loosened, the tires had to be inflated or deflated via a manual pump, etc. While the process was a fairly accurate representation of what’s necessary in real life, it was difficult to communicate and, to be honest, not that much fun. So we simplified the game mechanic to allow the player to snap the tires on and off, creating a juicier and much more fun experience while reducing the repetition necessary to actually do car repair. This happened in several places. Our tea kettle in the kitchen doesn’t have a lid for similar reasons, the blender uses a slot machine-style arm to blend, and the coffee machine just has one button. Even though these objects don’t look exactly like this in real life, the simplicity of the design is able to better communicate the possibilities of play, much like how children's toys simplify real-world interactions.
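The “snap the tires on and off” simplification boils down to attaching an object to a socket the moment it is released close enough, rather than simulating every real-world step. A minimal Python sketch of that idea, with a snap radius chosen purely for illustration:

```python
import math

# Illustrative sketch of a snap-on interaction: on release, an object
# either snaps cleanly to its socket or stays where it was dropped.
# SNAP_RADIUS and all names here are assumptions, not Owlchemy's code.

SNAP_RADIUS = 0.15  # meters; releasing within this distance snaps on

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def try_snap(released_pos, socket_pos):
    """Snap the object to the socket if close enough, else leave it be."""
    if distance(released_pos, socket_pos) <= SNAP_RADIUS:
        return socket_pos   # snap: the tire jumps onto the hub
    return released_pos     # miss: the object falls under normal physics
```

The generous radius is the point of the design: it trades physical accuracy for an interaction that always reads clearly and feels satisfying.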
What analytics are important to you, and how do you use them to improve your VR experience?
AS: We use Unity Analytics in Job Simulator to give us some basic insights into player behavior, but we’ve found that VR analytics are muddied with bad data. For example, if you look at play session length, the data is completely wrong due to the fact that players, after a fantastic VR play session, sometimes put the headset down on the ground and walk away to talk with friends or move on to other things. So while reported play time might look extremely high, the underlying data may be inaccurate. Also, early VR has trended toward “VR parties,” where multiple users will pass and play a game together. Without a way to figure out whether this is happening, the data can sometimes be misleading. For that reason, we find that observing users actually playing the game in person is our best resource for generating data. We do in-house testing with friends, family, and users who have never tried VR before. Once someone’s familiar with Job Simulator, their testing becomes essentially useless to us, since we’re trying to perfect the first-time user experience. We have folks in twice a week during development and once a day during the polish phase of a project. We use Trello to compile bugs and triage them into tasks.
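One way to cope with the “headset left on the floor” problem Alex describes is a simple cleaning pass that discards implausibly long sessions before computing averages. This is a hedged sketch of that idea only; the cutoff value is an assumption, not something from the interview.

```python
# Illustrative data-cleaning sketch: drop session lengths that are so
# long they likely reflect a headset left running, not real play.
# The cutoff is an assumed value for the example.

MAX_PLAUSIBLE_MINUTES = 120  # assumed cutoff for a single real session

def clean_session_lengths(minutes):
    """Keep only sessions short enough to be plausible play time."""
    return [m for m in minutes if m <= MAX_PLAUSIBLE_MINUTES]

def mean_session(minutes):
    """Average session length after discarding outliers."""
    kept = clean_session_lengths(minutes)
    return sum(kept) / len(kept) if kept else 0.0
```

A cleaning pass like this still can’t detect pass-and-play “VR parties,” which is exactly why in-person observation remains the better source of truth.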
How did you determine your business goals for Job Simulator, and what are your thoughts on progress toward those goals?
AS: For Owlchemy, we never actually set out to have a specific revenue goal for Job Simulator. We knew that we needed to make back our costs and allow for a way to continue to develop for VR moving forward, but this game was purely meant to be the optimal showcase of the incredible accuracy, realism, and fun that accurately tracked VR could bring players of all ages. With a market that’s so early in its development, we really just wanted to make an approachable and wildly entertaining piece of content that demoed well and would hopefully be the go-to experience for players to show their friends how amazing VR could be. We couldn’t be prouder of the fact that so many people have enjoyed the game and continue to do hilarious and creative things in the game and show it to their friends, family, and the world.
Thanks for the insights, Alex! We appreciate your time.
— The Oculus Team