Today we are joined by the team from Darkwind Media, as they share the challenges and lessons learned from developing a basic occlusion culling system for mobile VR. Be sure to return next week for part 2 of this series, when the team outlines how to extend the system even further.
Hello! We at Darkwind Media worked hard to bring Gear VR and Go owners a chance to
play Camouflaj’s Republique in VR. Along the way we had to explore every possible avenue to increase the performance of this graphics-heavy game, but none worked as well for us as
the Dead Secret method of occlusion culling. This article is part one of a two-part series that showcases how we developed a custom occlusion culling solution tailored to fit our specific needs, and the lessons we learned along the way. Part one will go over the specifics of how the basic system functions, while part two will detail more advanced usage in Republique, exploring how the system can be expanded to deal with more complex needs in the future.
Republique is played by hacking into any of the fixed CCTV cameras within the Metamorphosis facility, to guide the main character Hope to safety. Along the way, the player uncovers the hidden truths of Metamorphosis through many hours of recorded audio, and learns the dystopian pitfalls of her surveillance state.
Republique is the original “AAA mobile game,” and its environments were built from the start with a hierarchy of divisions. Each episode consists of about 3 to 5 areas, each area is divided into many rooms, and each room has anywhere from 72 to over 1000 renderers. Below is a demonstration of the rooms in the Brig/Power Station area. This area:
- Features 53 fixed security cameras
- Is divided into 17 rooms
- Contains a total of:
  - 3,989 opaque renderers
  - 511 alpha-clipped renderers
  - 88 alpha-blended (or “transparent”) renderers
  - 67 other renderers, which we exclude from occlusion culling
The player and Hope may only be in one area at a time; however, each camera has a hand-authored list of rooms to be loaded together while the player is viewing through that camera. This generally results in cameras loading the current room, plus a few other nearby rooms which can be seen through doorways or arches. Distant doorways often do not have a real room behind them, instead they have a “closet” -- a simple extrusion with a cubemap image of the room projected onto it.
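For illustration, the data behind such a hand-authored mapping could be as simple as the following sketch (the CameraRoomList type and its fields are hypothetical, not Republique’s actual asset format):

using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: a hand-authored asset listing which rooms to load while
// the player is viewing through one particular fixed camera.
[CreateAssetMenu(menuName = "Levels/Camera Room List")]
public class CameraRoomList : ScriptableObject
{
    public string cameraId;                                // the fixed CCTV camera this entry belongs to
    public List<string> roomsToLoad = new List<string>();  // the current room plus nearby visible rooms
}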
Throughout development and testing, Republique VR was CPU-bound as often as it was GPU-bound. We eventually noticed that culling was taking a fairly significant amount of CPU time every frame. It was easy to miss, because it’s not just “culling” in the profiler; there are a few other line items that can add up to a whole extra precious millisecond! But since we had no occlusion culling set up at this point… all of this time was spent just to do frustum culling.
So what could we do about it? Often a game camera, despite being very close to an adjoining room, can see only a very small portion of that room. This results in a lot of extra objects, such as furniture, tabletop clutter, foliage and even wall segments being loaded even though the player cannot possibly see them around the corner. But Unity doesn’t know that! So it tests whether they are in the frustum… every frame.
We needed some way to shut off the excess renderers so Unity wouldn’t have to deal with them. We needed occlusion culling. With it, Unity would have far fewer objects to test and sort through every frame. Now, Unity’s own occlusion culling system is surely fine for some, but because it’s built to accommodate all kinds of games, it can’t make assumptions about where the camera is and how it moves. And unfortunately we just couldn’t justify anything that would have a CPU cost every frame.
For Republique, nearly every camera through which the player can see is fixed in location! We don’t really need to update the occlusion every frame… In fact, to free up precious runtime cycles, we should be able to calculate every renderer the player could possibly see from a given camera location ahead of time. This is called a
potentially visible set, and is essentially just a list of every renderer that should be turned on when the camera is at a certain point (while the remaining renderers should be turned off).

In the Unity profiler captures below, the left image shows an example scene without occlusion culling and the right image shows the same scene with our occlusion culling implementation. The yellow is “VSync,” or time the CPU spent waiting for the GPU to finish, while the bright green is time the CPU spent preparing and issuing commands to the GPU. In this scene, you can see that the green bar -- which includes culling -- became considerably smaller as a result of occlusion culling. In addition, the CPU spent far less time waiting for the GPU, from which we can deduce that the GPU’s work was also significantly reduced.
How Our Solution Works
Note: all pseudo code is provided for demonstration purposes only and is not guaranteed to be complete, optimized or fit for any particular use.
To make a potentially visible set, we have the GPU draw the scene in such a way that we can check to see which renderers ended up in the final image. We do this by assigning a unique, solid color material to every renderer possible.
Begin()
{
    For each renderer...
        Pick a unique random color that isn’t black.
        Make a dictionary mapping color → renderer.
        Make a dictionary mapping renderer → color.
        Create replacement materials for each material used by the renderer.
}
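To make this concrete, here is a minimal C# sketch of the colorizing step, assuming Unity’s built-in “Unlit/Color” shader for the replacement materials (the ColorIdAssigner class and its names are ours for illustration, not necessarily what shipped in Republique):

using System.Collections.Generic;
using UnityEngine;

// Sketch: give every controlled renderer a unique, flat, non-black color so it
// can be identified when the rendered image is read back.
public class ColorIdAssigner
{
    // Colors are packed into ints (0xRRGGBB) so they can be matched exactly later.
    public readonly Dictionary<int, Renderer> colorToRenderer = new Dictionary<int, Renderer>();
    public readonly Dictionary<Renderer, int> rendererToColor = new Dictionary<Renderer, int>();

    public static int PackColor(Color32 c) { return (c.r << 16) | (c.g << 8) | c.b; }

    public void Begin(IEnumerable<Renderer> renderers)
    {
        // Any unlit, opaque, flat-color shader works; lighting or fog would corrupt the IDs.
        var unlit = Shader.Find("Unlit/Color");

        foreach (var renderer in renderers)
        {
            // Pick a unique random color that isn't black (black is the clear color).
            Color32 color;
            int key;
            do
            {
                color = new Color32((byte)Random.Range(0, 256),
                                    (byte)Random.Range(0, 256),
                                    (byte)Random.Range(0, 256), 255);
                key = PackColor(color);
            }
            while (key == 0 || colorToRenderer.ContainsKey(key));

            colorToRenderer[key] = renderer;
            rendererToColor[renderer] = key;

            // Swap every material on the renderer for a flat-color material.
            // (The original materials should be cached elsewhere and restored after the bake.)
            var replacements = new Material[renderer.sharedMaterials.Length];
            for (int i = 0; i < replacements.Length; i++)
            {
                replacements[i] = new Material(unlit) { color = color };
            }
            renderer.sharedMaterials = replacements;
        }
    }
}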
We then capture a full 360° image of the world from one point. To do this, we tell a camera with a 90° field of view to draw into a square buffer once for each of the 6 cubemap-face directions. It is very important to disable anything that might result in colors blending, so we take care to disable antialiasing, create a new camera with no post-fx attached, and use full 32-bit color textures:
DrawCubemap(Vector3 position, Camera gameCamera)
{
    Create new camera with no other scripts, copying from the game camera;
    Give camera 90° FOV, no MSAA, and clear to black;
    Create 6 512x512 textures with no mipmaps and ARGB32 format;
    Render camera into textures with rotations:
        Quaternion.identity;
        Quaternion.AngleAxis(90, Vector3.up);
        Quaternion.AngleAxis(180, Vector3.up);
        Quaternion.AngleAxis(270, Vector3.up);
        Quaternion.AngleAxis(90, Vector3.right);
        Quaternion.AngleAxis(-90, Vector3.right);
}
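A minimal C# sketch of that capture step might look like the following, assuming it runs as an editor-time bake (again, the names are illustrative):

using UnityEngine;

public static class CubemapCapture
{
    // Renders the colorized scene into six 512x512 faces from one position and
    // returns the faces as readable CPU-side textures.
    public static Texture2D[] DrawCubemap(Vector3 position, Camera gameCamera)
    {
        // A bare copy of the game camera: no scripts or post effects, no MSAA, clear to black.
        var go = new GameObject("OcclusionBakeCamera");
        var cam = go.AddComponent<Camera>();
        cam.CopyFrom(gameCamera);
        cam.enabled = false;                 // render only when we call Render() explicitly
        cam.fieldOfView = 90f;
        cam.allowMSAA = false;
        cam.clearFlags = CameraClearFlags.SolidColor;
        cam.backgroundColor = Color.black;
        cam.transform.position = position;

        Quaternion[] rotations =
        {
            Quaternion.identity,
            Quaternion.AngleAxis(90f, Vector3.up),
            Quaternion.AngleAxis(180f, Vector3.up),
            Quaternion.AngleAxis(270f, Vector3.up),
            Quaternion.AngleAxis(90f, Vector3.right),
            Quaternion.AngleAxis(-90f, Vector3.right),
        };

        var rt = new RenderTexture(512, 512, 24, RenderTextureFormat.ARGB32)
        {
            antiAliasing = 1,
            useMipMap = false
        };

        var faces = new Texture2D[rotations.Length];
        for (int i = 0; i < rotations.Length; i++)
        {
            cam.transform.rotation = rotations[i];
            cam.targetTexture = rt;
            cam.Render();

            // Read the face back so the pixels can be matched to renderers later.
            RenderTexture.active = rt;
            faces[i] = new Texture2D(512, 512, TextureFormat.RGBA32, false);
            faces[i].ReadPixels(new Rect(0, 0, 512, 512), 0, 0);
            faces[i].Apply();
        }

        RenderTexture.active = null;
        cam.targetTexture = null;
        Object.DestroyImmediate(rt);
        Object.DestroyImmediate(go);
        return faces;
    }
}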
Once we have the textures, we simply read them as color arrays, match those colors to renderers and add them to a HashSet. That completes our first attempt at a potentially visible set!
CreatePotentiallyVisibleSet(Vector3 position, Camera gameCamera)
{
    Enable all renderers;
    DrawCubemap(position, gameCamera);
    Create empty visible set;
    For each texture in the cubemap…
        For each pixel in the texture…
            Match the color to a renderer and add it to the visible set;
}
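Continuing the sketch, the readback and color matching could look roughly like this, reusing the hypothetical DrawCubemap helper and color dictionary from above (it assumes all controlled renderers are enabled before the capture):

using System.Collections.Generic;
using UnityEngine;

public static class VisibleSetBuilder
{
    // Packs a pixel into the same integer key used when the colors were assigned.
    static int PackColor(Color32 c) { return (c.r << 16) | (c.g << 8) | c.b; }

    public static HashSet<Renderer> CreatePotentiallyVisibleSet(
        Vector3 position, Camera gameCamera,
        Dictionary<int, Renderer> colorToRenderer)
    {
        var visible = new HashSet<Renderer>();
        Texture2D[] faces = CubemapCapture.DrawCubemap(position, gameCamera);

        foreach (var face in faces)
        {
            // GetPixels32 returns every pixel of the face as a flat array.
            foreach (Color32 pixel in face.GetPixels32())
            {
                if (colorToRenderer.TryGetValue(PackColor(pixel), out Renderer renderer))
                {
                    visible.Add(renderer);   // this renderer survived the depth test somewhere
                }
            }
        }
        return visible;
    }
}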
To cull the game while it is running, we simply iterate over every renderer, enabling it if it is in the potentially visible set, or disabling it if it is not. In Republique, this is done during the brief fade to black when switching cameras. You’ll have to decide the best time to do this in your own game.
// Serialized data.
List<Renderer> allControlledRenderers;
HashSet<Renderer> potentiallyVisibleSet;
// Note that Unity will not serialize a HashSet. You’ll have to serialize it as an array or list and turn it into a HashSet in Awake.
Cull()
{
    For each controlled renderer...
        Set renderer enabled if potentiallyVisibleSet contains renderer, disabled otherwise;
}
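A runtime component built around that data might look roughly like this (a sketch with illustrative names, not Republique’s actual component):

using System.Collections.Generic;
using UnityEngine;

// Stores the baked potentially visible set for one fixed camera position and
// applies it on demand. Unity cannot serialize a HashSet, so the set is
// serialized as a List and rebuilt in Awake.
public class PotentiallyVisibleSetVolume : MonoBehaviour
{
    [SerializeField] List<Renderer> allControlledRenderers = new List<Renderer>();
    [SerializeField] List<Renderer> potentiallyVisibleList = new List<Renderer>();

    HashSet<Renderer> potentiallyVisibleSet;

    void Awake()
    {
        potentiallyVisibleSet = new HashSet<Renderer>(potentiallyVisibleList);
    }

    // Called during the brief fade to black when the player switches cameras.
    public void Cull()
    {
        foreach (var renderer in allControlledRenderers)
        {
            renderer.enabled = potentiallyVisibleSet.Contains(renderer);
        }
    }
}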
Now let's try it!
Notice the empty spot behind the glass window? Since we rendered transparent objects as a solid color, we accidentally allowed them to occlude renderers behind them. To solve this, we have to draw the colorized scene without transparent objects so they don’t occlude anything. But we still want to know if the transparent objects themselves are occluded, so we also need to draw the colorized scene with them. In other words, we want transparent objects to be occludees, but not occluders.
How do we resolve this? First, we capture the scene with all renderers enabled, including transparent renderers, and make a visibility set. We know for sure that any transparent renderers seen in this capture are not occluded by opaque renderers, so we turn them off and capture again, taking the union of both visibility sets. Any newly seen renderers (both transparent and opaque) were occluded only by the transparent renderers we just turned off, so once again, we turn off the newly seen transparent renderers and capture once more. We keep doing this until no new transparent renderers are seen. When this is accomplished, we know any remaining transparent renderers are definitely occluded by opaque renderers. And because the last capture could not see any transparent renderers, we also know any opaque renderers we haven’t yet seen are definitely occluded by opaque renderers, not transparent ones.
CreatePotentiallyVisibleSet(Vector3 position, Camera gameCamera)
{
    Enable all renderers;
    Make a list of only transparent renderers;
    Create empty visible set;
    Do
    {
        DrawCubemap(position, gameCamera);
        Create a temporary empty set;
        For each texture in the cubemap…
            For each pixel in the texture…
                Match the color to a renderer and add it to the temporary set;
        For each transparent renderer in our list…
            If the renderer appears in the temporary set…
                Disable it and remove it from the list;
        Join (union) the visible set with the temporary set;
    }
    While (count of transparent renderers seen in the last iteration > 0);
}
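Expressed in C#, the iterative capture could look roughly like this, building on the hypothetical helpers sketched earlier:

using System.Collections.Generic;
using System.Linq;
using UnityEngine;

public static class TransparentAwareBake
{
    public static HashSet<Renderer> CreatePotentiallyVisibleSet(
        Vector3 position, Camera gameCamera,
        Dictionary<int, Renderer> colorToRenderer,
        List<Renderer> allRenderers, List<Renderer> transparentRenderers)
    {
        // Start with every controlled renderer enabled.
        foreach (var r in allRenderers) r.enabled = true;

        var remainingTransparent = new List<Renderer>(transparentRenderers);
        var visible = new HashSet<Renderer>();
        int transparentSeen;

        do
        {
            // Capture and read back the scene as it currently stands.
            HashSet<Renderer> seenThisPass =
                VisibleSetBuilder.CreatePotentiallyVisibleSet(position, gameCamera, colorToRenderer);

            // Any transparent renderer seen this pass is definitely not occluded by
            // opaque geometry; record it, then disable it so it stops occluding others.
            List<Renderer> seenTransparent =
                remainingTransparent.Where(seenThisPass.Contains).ToList();
            foreach (var r in seenTransparent)
            {
                r.enabled = false;
                remainingTransparent.Remove(r);
            }
            transparentSeen = seenTransparent.Count;

            visible.UnionWith(seenThisPass);
        }
        while (transparentSeen > 0);

        return visible;
    }
}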

Compared to simply capturing the whole scene once per transparent object to determine whether it is occluded or not, this solution maximizes the number of renderers for which we can determine occlusion status at one time. Meanwhile, compared to capturing the whole scene once with transparent renderers and once without, this method helps to avoid incorrectly occluding transparent renderers that are completely covered by other transparent renderers, such as bottles or glasses in a display case.
Success! We’ve implemented a very basic occlusion culling system!
In the next article we will discuss the limitations of this method of occlusion culling and propose a way to extend it. We hope reading this has inspired you to get creative with performance optimizations in your own games. We’d love to hear from you, so be sure to reach out & follow Darkwind on
Twitter and
Facebook, where we post about some of our other projects, as well as the occasional game jam entry or developer blog. Thanks for reading and good luck with your own mobile VR endeavors!
- The Darkwind Team