Common Rendering Mistakes: How to Find Them and How to Fix Them
Oculus Developer Blog | Posted by Trevor Dasch | August 22, 2019

So you’ve mastered your assets, you’ve designed your levels, your performance is a solid 72fps. But you put on the headset and it looks terrible! As a Developer Relations Engineer at Oculus, I see this problem all the time. In this article I’ll share the tips and tricks for making your game look its best.

Jaggies everywhere!

You load up your game, and your menu is a sparkly mess, and your environment looks like a saw-toothed nightmare. What’s the deal?

Geometry Aliasing - what it is and why it happens

These ‘jaggies’ are caused by aliasing, which happens when you rasterize an image onto a 2D grid of pixels. Each pixel can only have one color, so the GPU selects the color at the very center of the pixel as each shape is drawn. When the computer renders your geometry, it turns your 3D meshes into a series of 2D triangles, and the center of each pixel is either inside a given triangle or it isn’t, hence the saw-tooth pattern.

How do we avoid this? A technique known as MSAA (multisample anti-aliasing) solves the problem by testing multiple points per pixel against your polygon, then contributing a percentage of your polygon’s color to the final rendered pixel proportional to the number of covered points. With 2x MSAA, two sample points are checked against the triangle: if the triangle covers both, it contributes 100% of the pixel’s color; if only one sample falls inside the triangle, 50% of the pixel color comes from that triangle. The color of the triangle is computed only once for all the samples, so it’s only slightly more expensive on the GPU than using a single sample. We recommend 4x MSAA, as the quality difference makes the extra cost worth it.
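The resolve math is simple enough to sketch outside a shader. Here is an illustrative Python model (not GPU code; the function name is made up for illustration) of how a triangle’s single computed color is weighted by its sample coverage:

```python
def msaa_resolve(triangle_color, background_color, covered, total):
    """Blend a triangle into one pixel, weighted by how many of the
    pixel's MSAA sample points the triangle covers."""
    coverage = covered / total
    return tuple(coverage * t + (1.0 - coverage) * b
                 for t, b in zip(triangle_color, background_color))

# 2x MSAA: a red triangle covering 1 of 2 samples over a black
# background contributes 50% of the edge pixel's color.
edge = msaa_resolve((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), covered=1, total=2)
```

Note the triangle color is computed once and only the coverage weight varies per pixel, which is why MSAA is so much cheaper than rendering at a higher resolution.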

Ok, so you’ve turned on MSAA and now your world and characters all look smooth, but your UI is still a shimmering mess. I thought you said that would fix my problems?

Right, so it turns out MSAA is not a silver bullet. It doesn’t apply to transparent objects, because the implementation requires writing to the depth buffer, which you can’t do with transparency. For any of your solid UI quads, just make them opaque. This kills two birds with one stone: you’ve fixed the aliasing and eliminated a case of overdraw. Go you! But if your background is semi-transparent, or the corners are rounded off, the easiest fix is to add a border of transparent pixels around all of your UI sprites. It’s a pain to go back and remaster your art, but in the end it will look cleaner than MSAA, since the alpha of each edge pixel will be perfectly correlated with the pixel coverage of your sprite. As an example, an extra row of transparent pixels can make a huge difference with TextMeshPro components in Unity; it’s as easy as checking the “Extra Padding” checkbox on your TMP component. Check that now and thank me later!

MSAA also doesn’t apply to alpha-cutout materials, since the clip happens per pixel. It turns out there’s a nifty thing you can do called Alpha-To-Coverage, or as Unity calls it, AlphaToMask. This lets you tell the GPU how many MSAA samples your pixel should cover by setting your alpha value. For example, with 4x MSAA, if I want a 50% blend I just set my alpha to 0.5, and for 75% opacity my alpha would be 0.75. This sounds complicated, but it actually makes swapping out alpha clip a breeze: you just make your shader output the alpha value you retrieved from your texture sample. This works because the raw texture pixels have an alpha of either 0 or 1, so the alpha result of bilinear interpolation directly corresponds to the coverage of your pixel.
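To make the mapping concrete, here’s a small Python sketch (hypothetical helper, and a simplification: real GPUs choose a sample bitmask, sometimes with dithering) of how an alpha value translates to an MSAA coverage count:

```python
def coverage_samples(alpha, num_samples=4):
    """How many MSAA sample points an alpha value covers,
    clamped to the valid range for the sample count."""
    covered = int(round(alpha * num_samples))
    return max(0, min(covered, num_samples))

# With 4x MSAA: alpha 0.5 covers 2 of 4 samples (a 50% blend),
# alpha 0.75 covers 3 of 4 (75% opacity).
half = coverage_samples(0.5)
three_quarters = coverage_samples(0.75)
```

Because coverage is quantized to the sample count, 4x MSAA gives you five effective opacity steps per pixel (0%, 25%, 50%, 75%, 100%), which is plenty for anti-aliasing a cutout edge.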

Technical Artist and generally awesome fellow Ben Golus wrote the ideal introduction to Alpha-To-Coverage. If you’re looking for more information on the subject I highly recommend it.

Shimmery, sparkly textures

Now you’ve resolved all your jaggies but you’re still seeing weird sparkles on some images until you get right up close to them. What gives?

Texture Aliasing

Texture aliasing, like geometry aliasing, happens when rasterizing to a 2D grid of pixels, though this is completely avoidable. Most textures (as is the default setting in Unity) are bilinearly sampled. If the pixels line up, the output texture will match the exact same pixel layout as the source image.

If the source image is sampled slightly offset (which will happen most of the time with a free camera), every pixel in the output will be a blend of the four pixels in the source image that overlap the resulting pixel in your render target.

This means that if the resolution of the texture you’re sampling matches the resolution of your render target, every source pixel contributes equally to the output, even if the source texture doesn’t line up perfectly with the destination. If the texture you’re sampling is lower resolution than the output image, each pixel in the source image will contribute to more pixels in your output image, but every source pixel still contributes equally. However, if the texture you’re sampling is higher resolution than the render target, the source pixels will not be sampled in equal measure, and eventually some pixels won’t be sampled at all. This uneven sampling is what causes the aliasing.
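You can see the uneven sampling with a quick sketch. This Python snippet (a nearest-sample simplification of the bilinear footprint; the function name is made up) lists which source texels land closest to each output pixel’s center:

```python
def sampled_texels(src_size, dst_size):
    """Source texel indices that fall nearest each output pixel center."""
    return sorted({int((i + 0.5) * src_size / dst_size) for i in range(dst_size)})

# Matching resolutions: every texel of a 4-texel source contributes.
matched = sampled_texels(4, 4)   # texels 0, 1, 2, 3

# Oversized source: an 8-texel source shrunk onto 4 pixels only ever
# hits texels 1, 3, 5, 7; the others never contribute, and as the
# camera moves, which texels get skipped changes, causing shimmer.
skipped = sampled_texels(8, 4)
```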

How do you take advantage of this information? If your texture is at a fixed distance, you can just pick the perfect resolution so that it samples at a 1:1 pixel match; the Unity splash screen with a custom icon is a good example. I’ve seen way too many over-aliased splash images, and there’s no need for that: make sure the texture size is correct and it will look amazing.

On the other hand, if you can get closer to or farther from a texture, which is 99.9% of textures in VR, there’s simply no way to pick one perfect resolution. So what we do is take our full-resolution image, create a half-sized version for when you’re farther away, then one half the size of that, and so on. These are called mipmaps, and GPUs support swapping between them automatically, so you’re always sampling the right size image. In Unity, you just need to hit the ‘Generate Mip Maps’ checkbox on each texture, and it will do this for you.

But what about textures you load in dynamically? It’s possible to generate your mipmaps at runtime, and paying that small one-time cost per texture is usually the right choice. Sometimes you can’t, though, for example when playing a video, where the texture changes every frame. In that case, it can be better to change your shader to perform multiple samples per fragment to reduce aliasing at most distances. This isn’t cheap on the GPU, so you should save it for cases where you really need it!
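The mipmap chain itself is easy to picture: each level halves the previous one until you reach a single texel. A quick Python sketch (helper name is made up for illustration):

```python
def mip_chain(width, height):
    """Sizes of every mipmap level, from full resolution down to 1x1."""
    sizes = [(width, height)]
    while width > 1 or height > 1:
        width, height = max(1, width // 2), max(1, height // 2)
        sizes.append((width, height))
    return sizes

# An 8x8 texture gets four levels: 8x8, 4x4, 2x2, 1x1.
levels = mip_chain(8, 8)
```

The whole chain only adds about a third more memory on top of the base texture, which is why enabling mipmaps is almost always worth it.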

This can be achieved in your fragment shader by using ddx and ddy (dFdx and dFdy in GLSL) to pick 4 sample points that represent the 4 quadrants of your pixel. For Unity shaders, you can drop this function in, then swap out your calls to tex2D with tex2Dmultisample.

// Averages four texture samples taken at the quadrant centers of the
// current pixel. ddx/ddy give the change in uv per screen pixel, so
// offsetting by a quarter of each derivative lands one sample in each
// quadrant of the pixel's footprint.
fixed4 tex2Dmultisample(sampler2D tex, float2 uv)
{
	float2 dx = ddx(uv) * 0.25;
	float2 dy = ddy(uv) * 0.25;

	fixed4 sample0 = tex2D(tex, uv + dx + dy);
	fixed4 sample1 = tex2D(tex, uv + dx - dy);
	fixed4 sample2 = tex2D(tex, uv - dx + dy);
	fixed4 sample3 = tex2D(tex, uv - dx - dy);

	return (sample0 + sample1 + sample2 + sample3) * 0.25;
}

Everything is Blurry!

So you’ve resolved all of the aliasing in your game. All of your lines are clean and smooth, but for some reason your textures are blurry unless you get up close. To make matters worse, on some of your textures, like a floor, you’re seeing weird seam-like edges where the image goes from slightly blurry to even more blurry. And on top of that, there’s a really noticeable pop when they switch from blurry to clear. Don’t worry, this is all really easy to fix!

That “Pop”

When you first turn on mipmaps, you’ll see the “pop” when the GPU decides to switch between the different mipmap levels. It’s really obvious in VR, and it looks pretty bad. This is due to the mipmap selection function being set to ‘nearest’, which Unity does when filtering is set to its default ‘Bilinear’ setting. However, there is a ‘Trilinear’ filtering option. Because you’re almost never at exactly the right distance for a single mipmap, trilinear filtering samples the two closest mipmaps for your current distance, one slightly too high-res and one slightly too low-res, and blends between them linearly based on distance. This gives a nice smooth transition between all of your mipmap levels, and keeps the image slightly crisper a little longer, making things look way better. There is a slight performance cost to this, but realistically you probably won’t detect it. This also smooths out that weird seam you can sometimes see where a different mipmap is used on the same object; now it’s a nice smooth transition.
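The idea reduces to simple math: split a fractional mip level into two integer levels plus a blend weight. A hypothetical Python sketch:

```python
import math

def trilinear_blend(lod):
    """Split a fractional mip level into the two levels trilinear
    filtering samples, plus the blend weight toward the coarser one."""
    lower = math.floor(lod)
    return int(lower), int(lower) + 1, lod - lower

# At a distance where the ideal level is 2.25, the GPU samples mips
# 2 and 3 and blends them 75/25, so there is never a hard switch.
mip_a, mip_b, weight = trilinear_blend(2.25)
```

With ‘nearest’ selection, the same 2.25 would snap straight to mip 2, and the pop you see is that snap crossing the .5 boundary as you move.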

Blurry Floors (and other things)

Have you noticed that when you look at the floor, especially in the distance, it tends to look much blurrier than it should? But when you look straight down it looks fine... The reason for this lower resolution is that when you view a surface at a sharp angle, the sample rate of pixels is much higher in one dimension of the texture and much lower in the other, and the GPU uses the lower of the two to pick the mipmap. However, there is a way to make it smarter about selecting each mipmap level, and that is anisotropic filtering. Anisotropic filtering is a mildly expensive technique in which multiple samples may be taken in order to avoid aliasing in only one direction. It’s not necessary on every object, but it is good to use on most things that can be viewed at oblique angles. It can make a massive quality difference!
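A simplified model (not exactly what any particular GPU does; function names are made up) shows why the floor goes blurry and what anisotropic filtering buys you. Here du and dv are how many texels the sample point moves per screen pixel along each texture axis:

```python
import math

def lod_isotropic(du, dv):
    """Standard filtering picks the mip from the coarser (larger) axis,
    so an oblique floor with dv much larger than du gets a blurry mip."""
    return math.log2(max(du, dv))

def lod_anisotropic(du, dv, max_aniso=16):
    """Anisotropic filtering takes extra taps along the long axis,
    letting the mip level come from the short axis instead."""
    ratio = min(max(du, dv) / min(du, dv), max_aniso)
    return math.log2(max(du, dv) / ratio)

# Floor viewed at a sharp angle: 1 texel per pixel across the view,
# 8 texels per pixel into the distance. Isotropic filtering picks
# mip 3 (blurry); anisotropic can stay at mip 0 with up to 8 taps.
blurry = lod_isotropic(1.0, 8.0)
sharp = lod_anisotropic(1.0, 8.0)
```

The extra taps are the “mildly expensive” part: the cost scales with how oblique the surface is, up to the max anisotropy level you allow.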

The gif below provides a side-by-side comparison of applying a trilinear filter (left) vs a trilinear with anisotropic filter (right).

Sometimes you just have to put your finger on the scale

Your settings are all good to go, but it still seems like some things are blurrier than you’d like, and you still need to get really close before they appear sharp. Thankfully, you can put your finger on the scale a little bit when it comes to mipmap selection, and we do this by specifying a mipmap bias. A bias less than zero makes the GPU select a higher-resolution mip level at a greater distance, and a bias greater than zero does the opposite. In my experience a bias of -0.7 does the trick quite nicely for detailed textures, keeping clarity higher at greater distances, though you can start to see aliasing again once you get past a certain point. Feel free to play around with this number for your use case. The gif below provides a side-by-side comparison of a trilinear filter with no bias (left) vs one with a -0.7 bias (right).
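The bias is just an offset applied before mip selection. A minimal Python sketch (the clamping here is a simplification of real hardware behavior):

```python
def biased_lod(base_lod, bias, max_lod=10.0):
    """Apply a mipmap bias to the GPU's computed mip level, clamped to
    the available chain. Negative bias = sharper but more aliasing."""
    return min(max(base_lod + bias, 0.0), max_lod)

# A -0.7 bias pulls a computed level of 2.0 down toward mip 1,
# so the texture stays crisper at that distance.
lod = biased_lod(2.0, -0.7)
```

In Unity, this can be set per texture from script via Texture.mipMapBias.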

For higher quality at the trade-off of increased rendering cost, you can even combine mipmap bias with a supersampling shader. Once again, Ben Golus has created a great reference for taking advantage of this technique: Sharper Mipmapping using Shader Based Supersampling.

Give me the TL;DR

You’re in a rush, and you don’t need the lecture. You just want to know what you should do. Well here it is:

  • Turn on 4x MSAA, it’s absolutely worth it
  • Alpha blend doesn’t get MSAA, work around it with transparent borders
  • If you’re using Unity’s Text Mesh Pro, turn on Extra Padding on all instances
  • Alpha To Coverage looks better than Alpha Cutout
  • Do this for all textures:
    • Enable mipmaps
    • Enable trilinear filtering
  • On your environment textures:
    • Turn on anisotropic filtering
  • On high detail textures:
    • Set your mipmap bias to -0.7
  • Pick the correct resolution for your Unity splash screen logo!

Conclusion

You have now been equipped to identify the small details that really detract from the appearance of your game. The downside is, just like bad kerning, you’ll never be able to unsee a few of these mistakes when you spot them in other games. From here on out it is your duty to spread the word on best practices. Don’t let me down, soldier!

Signing off,

- Trevor Dasch