To get our maiden VR release running smoothly on Oculus Go, we had to do a lot of research: you may have already seen some of the available articles from Unity about
optimizing your game for mobile hardware, or this OC5 video presentation about
porting your Rift game to Quest. At Thunderbox we always try to do something a little different, so this post is going to feel a bit like those... only backwards.
How hard can it be to take a VR experience designed to run on a mobile device, and upgrade it for a ninja PC? Not that hard actually... provided you know what you are doing, and focus on delivering maximum user bang for your dev buck.
Our title, Tsuro: The Game of The Path, is based on the family board game of the same name, so the core gameplay is quite simple: players take turns placing tiles on a board, the game pieces follow the paths on those tiles, and a piece is removed if a path takes it over the edge. The winner is the last player still in play. The virtual version of Tsuro is 70 times larger than life, and set in an ancient Japanese temple. Because the gameplay is so simple, we focused on improving immersion and environmental fidelity.
As you can see below, Tsuro has a great low-poly style that looks awesome on Go; today we will outline the steps we took to make it worthy of the extra horsepower on the Oculus Rift...
6DOF & Touch
The first thing we tackled was 6DOF head tracking and the Oculus Touch controllers. Because we built Tsuro on a PC, we already had most of this implemented to speed up our iteration time, but there were a few finishing touches needed. As our environment has slopes, we scripted a simple height adjustment so users could walk around the rooftops and ramps. As the Go version of Tsuro had been designed to work with a single controller, we also had to make sure players could easily switch between left- and right-handed control at will, along with having the option to choose Oculus Touch controllers for Rift or Rift S (auto-detection is currently scheduled for v1.38 of the Unity Integration).
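The height adjustment can be sketched roughly as follows. This is a hypothetical illustration, not our actual script: it assumes a `ground_height` lookup standing in for a downward physics raycast, and eases the tracking-space origin toward the surface under the player's head.

```python
# Hypothetical sketch of the slope/rooftop height adjustment: each frame,
# sample the ground height under the player's head and move the
# tracking-space origin so the virtual floor tracks the geometry.

def ground_height(x, z):
    """Stand-in for a downward physics raycast against the environment.
    Here: a simple ramp that rises 0.5 m for every metre walked along +x."""
    return max(0.0, 0.5 * x)

def adjust_rig_height(rig_y, head_x, head_z, smoothing=0.2):
    """Ease the rig origin toward the surface height to avoid popping."""
    target = ground_height(head_x, head_z)
    return rig_y + (target - rig_y) * smoothing

# Walking up the ramp: the rig origin converges on the ramp height.
y = 0.0
for _ in range(50):
    y = adjust_rig_height(y, head_x=2.0, head_z=0.0)
print(round(y, 2))  # approaches 1.0, the ramp height at x = 2
```

The smoothing factor matters in VR: snapping the origin instantly to the surface height feels jarring, while easing over a few frames reads as a natural step up.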
Ultimately the most crucial update to the controllers was adding decent haptic feedback. The explosion below is so much cooler when you can feel it…
It’s fairly easy to use your existing audio files to drive the controllers’ haptics, but we found that a more exaggerated waveform felt much better, so we used a state machine to directly control the rumble and generated waveforms on the fly using our tweening system (the one for the explosion is shown below). This gave us precise control over the force feedback, with a faster iteration time. If you want to learn more about the custom tweening system, check out my previous Oculus developer guest post:
Insights from Creating an Oculus Go Puzzle Game.
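To give a feel for the approach, here is a minimal sketch of generating an exaggerated explosion envelope on the fly. The curve shape and numbers are illustrative assumptions, not the game's actual tween: a sharp attack with an ease-out decay, modulated by a low-frequency wobble so it doesn't feel like a flat buzz.

```python
import math

def explosion_rumble(duration=0.6, sample_rate=320):
    """Generate a rumble amplitude buffer (0..1) for an explosion."""
    n = int(duration * sample_rate)
    samples = []
    for i in range(n):
        t = i / n                      # normalised time 0..1
        decay = (1.0 - t) ** 2         # quadratic ease-out decay
        wobble = 0.75 + 0.25 * math.sin(2 * math.pi * 8 * t)
        samples.append(max(0.0, min(1.0, decay * wobble)))
    return samples

buf = explosion_rumble()
# Each frame, the state machine would feed the next sample to the
# controller vibration API as the rumble amplitude.
print(len(buf), max(buf))
```

Driving the amplitude directly like this, rather than converting an audio clip, is what lets you exaggerate the attack and tail until the effect feels right in the hand.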
I cannot stress the difference good haptics make to a game - once we implemented this, everything felt so much more “real”.
On a separate note, at this point you should have a version of your game that will be pretty tight on Oculus Quest. This would be a good time to branch your repository accordingly.
PBR Shaders
Now it’s time to leverage that high-end graphical power! We replaced all of the highly performant, vertex-lit, mobile shaders with snazzy PBR shaders for an instant boost in graphical quality. The specular lighting on PBR shaders really does make the environment feel more tangible and immersive.
Easy, eh? Alas no... the change in lighting response meant that our diegetic UI was totally unreadable. We had to carefully tune the lighting (more on that later), tweak the UI materials, and recolour a good chunk of the interface, just to get it legible again. Once we had the UI back, we tuned the material properties of certain elements to give a little more polish, and replaced the mobile water shader with something a bit fancier. Watch out with water: reflection and refraction require specific shader tweaks to work in VR.
Tuning Lighting and Shadows
Perhaps the toughest part of this transition was the
lighting. On mobile, to get decent performance while still looking good, we stuck to the following criteria...
The environment is a single object
The environment is unlit and lighting is all baked in advance
Only the main play area can receive shadows
Only a few items cast shadows at any one time
Shadows are hard and low-res
No environmental lighting
On PC these rules could be flipped...
We split up the environment to allow higher resolution light maps
The environment lighting is mixed - the majority is still baked, but...
A shadow map lets us use dynamic shadows across the entire scene (compressing shadow maps is not recommended!)
Every dynamic object casts a real-time shadow
Shadows are soft and high-res
The environment is lit by the Skybox
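The two rule sets boil down to a handful of settings flipped between build targets. The values below are illustrative, not our exact project settings, but the deltas mirror the rules above:

```python
# Illustrative lighting configuration per build target.
MOBILE = {
    "environment_pieces": 1,            # single combined mesh
    "lighting": "baked",                # everything baked in advance
    "shadow_casters": "few items",
    "shadow_quality": "hard, low-res",
    "skybox_ambient": False,
}
PC = {
    "environment_pieces": 24,           # split for higher-res lightmaps
    "lighting": "mixed",                # mostly baked + dynamic shadow map
    "shadow_casters": "all dynamic objects",
    "shadow_quality": "soft, high-res",
    "skybox_ambient": True,
}

flipped = [k for k in MOBILE if MOBILE[k] != PC[k]]
print(flipped)  # every one of these settings changed between builds
```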
We did a great deal of experimentation to get the baked lighting to look as good as possible, tweaking bounce, environmental light and texel density. Baking light in Unity is something of a dark art; I’d highly recommend this great tutorial. Top tip: when iterating on baked lighting, use a super low resolution to get a quick read, then, once you are happy with the results, crank it up and go for lunch while it computes the more detailed version. Those high-res maps can take a while to bake, but the results speak for themselves.
Now that you have those fancy shaders, you can leverage them even further by adding dynamic lights. Oculus recommends a maximum of 3 dynamic lights, but, as our environment was optimized for mobile, we were able to get away with even more. These were used for torchlight effects and to make the game pieces cast their own glow. We experimented with Forward and Deferred rendering, but there didn’t seem to be much of a performance difference either way.
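One common way to stay inside a dynamic light budget (not necessarily what Tsuro ships with, just a sketch of the general technique) is to keep only the N lights nearest the player enabled each frame and disable the rest:

```python
# Hypothetical dynamic light culling: rank lights by squared distance
# to the player and enable only the closest `budget` of them.
def pick_active_lights(light_positions, player_pos, budget=3):
    def dist2(p):
        return sum((a - b) ** 2 for a, b in zip(p, player_pos))
    ranked = sorted(range(len(light_positions)),
                    key=lambda i: dist2(light_positions[i]))
    return set(ranked[:budget])

# Four torches; only the three nearest the player stay lit.
lights = [(0, 2, 0), (10, 2, 0), (1, 2, 1), (50, 2, 50)]
active = pick_active_lights(lights, player_pos=(0, 1.7, 0))
print(active)  # indices of the lights to leave enabled
```

Using squared distance avoids a square root per light, and the set of active indices can be diffed against the previous frame so lights are only toggled when the ranking actually changes.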
Visual Effects & Grading
Now that everything is nicely lit, it’s time to upgrade your FX.
We switched out our mobile skybox to a
dynamically generated one, with drifting clouds, a sun that is in the correct place for the lighting, and a tunable gradient for smooth fog blending. Then we swapped most of the low-poly, unlit effects for higher detail versions with fancy, lit shaders (most notably on the petals, dust motes and cauldrons).
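Keeping the skybox sun "in the correct place for the lighting" comes down to deriving the sun direction from the same elevation and azimuth angles as the directional light used in the bake. A small, assumed-for-illustration helper:

```python
import math

# Hypothetical helper: compute a unit vector pointing toward the sun
# from the directional light's elevation/azimuth (degrees), so the
# procedural skybox sun lines up with the baked lighting.
def sun_direction(elevation_deg, azimuth_deg):
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    return (math.cos(el) * math.sin(az),   # x (east)
            math.sin(el),                  # y (up)
            math.cos(el) * math.cos(az))   # z (north)

d = sun_direction(45.0, 90.0)
print([round(c, 3) for c in d])  # sun due east, 45 degrees above horizon
```

Feeding the same angles to both the light and the skybox shader means you can re-tune the time of day without the sun disc and the shadows ever drifting apart.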
Finally, you get to use image post-processing! This is easy to set up and can make a massive difference to the look of your game. Again, the trick here is to not go overboard: a lot of these effects can look really bad in VR (depth of field, for example), but we deployed careful colour grading and a subtle bloom to give Tsuro a more dream-like vibe for Rift.
Hi-fidelity Audio
One thing that surprised us when porting to Rift was the difference in sound. Whilst we didn’t have time to investigate Oculus’ own sound integration, the Go version of Tsuro has a very dynamic soundscape, with some subtle reverb for realism. On PC we found that almost every sound generator needed to be re-tuned to make sure the roll-off was more realistic. Take note: in Unity a sound with logarithmic roll-off will never be truly silent (look at the curves below). We didn’t notice this on Go, but with the increased fidelity of Rift, this detail stuck out like a sore (and very loud) thumb.
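You can see why in the maths. Unity's logarithmic curve attenuates roughly as minDistance / distance, so the gain approaches zero but never reaches it; a linear (or custom) curve does hit zero at maxDistance. This is a sketch of the curve shapes, not Unity's exact implementation:

```python
# Logarithmic roll-off: gain ~ minDistance / distance (never zero).
def log_rolloff(distance, min_distance=1.0):
    return min(1.0, min_distance / max(distance, min_distance))

# Linear roll-off: gain reaches exactly zero at max_distance.
def linear_rolloff(distance, min_distance=1.0, max_distance=20.0):
    t = (distance - min_distance) / (max_distance - min_distance)
    return max(0.0, min(1.0, 1.0 - t))

# At the edge of the audible range the log curve is still non-zero:
print(log_rolloff(20.0))     # 0.05 -- still faintly audible
print(linear_rolloff(20.0))  # 0.0  -- actually silent
```

In practice this means a logarithmic source stays faintly audible no matter how far away you are, which is exactly the artefact that went unnoticed on Go but stood out on Rift.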
Pre-submission Testing
If you’ve already published on Go then you’ll be familiar with Oculus’
VRC tests, which are crucial for maintaining high quality content in their VR ecosystem. It’s worth noting that there are a few subtle differences for Rift, notably having to pause the game and disable the controls if the headset is removed. Be sure to go through
the official checklist and use the auto-validator tool to quickly catch the easy gotchas.
The Oculus Debug Tool is extremely handy for showing performance data while playing your game; weirdly, we noticed that our title ran a lot smoother when launched from the Oculus menu, as opposed to running the .exe directly.
Once you are confident that your game is hitting 90FPS, it’s time for the final squeeze...
Extra Details
So everything is looking snazzy, and your framerate is silky smooth, but that alone shouldn’t cut the mustard. What magic can you bring that will breathe extra life into your experience? The Go version of Tsuro has a surprise feature that everyone seems to really love: a little fox that randomly appears, wandering about the temple grounds. For the Rift version, we decided to give him some buddies - koi in the pond, finches on the fences and a bumblebee that buzzes lazily from flower to flower.
As an added bonus, with a little optimization this enhanced menagerie managed to find its way back into the Go version.
Finally, we took a little time to create some Custom Items for Oculus Home. We used Boxshot to convert the standard FBX format (usually used for our game assets) into GLB files that we could test locally, then upload to the Oculus Dash. We linked the items to some of Tsuro’s cooler achievements, and now players can unlock the Tsuro board game and some lovely Japanese dioramas to decorate their virtual space.
With 6DOF locomotion, haptic feedback, PBR shaders, carefully considered lighting, enhanced visual effects, re-balanced sound, VRC compliance and a little extra pizazz, Tsuro feels more immersive than ever. With the extra detail and polish, that initial screenshot is looking mighty fine...
We are thrilled to present “The Game of The Path” on Rift, and hope the enhanced experience will enthrall gamers even more than its mobile counterpart already has.
If you’d like to learn more about making great, accessible VR experiences, leave a comment below, follow @Thunderbox_ent on Twitter, or drop me a line on dan [at] thunderboxentertainment.com - I’m always happy to chat!