The physical side-effects of VR are well-documented and go by many names: simulator sickness, VR sickness, VORTAN, stillness illness and others. Despite extensive research, the precise physiological mechanisms are still poorly understood. This is not unique to VR – the causes of most forms of motion-induced illness, such as seasickness and space sickness, are also poorly understood. Although we don't understand the physiology well, we do understand many of the things that trigger it. A few of those triggers are inherent in the desired experience, but many of them can be solved with good (though complex) engineering. Because the symptoms are basically the same whatever the cause, it is often difficult to tell whether you are running into a physiological limit or whether you simply have your math wrong. This is especially tricky to track down when many of the symptoms take twenty minutes or more to manifest.
For added challenge, the brain can train itself to ignore all sorts of strange things if given months of daily exposure – which is exactly the sort of exposure you get when developing a VR game. This means that we developers rapidly become immune to all but the most obvious rendering errors, and as a result we are the worst possible people to test our own code. It introduces a new and exciting variation of the coder's defense that "it works on my machine" – in this case, "it works for my brain". Developers should ideally do a significant amount of testing on "kleenex" subjects before shipping anything, and because people's responses vary so widely, they need to test on multiple players to make sure there isn't some terrible howler that will make a significant portion of their players sick.
The problems we see in the current crop of VR experiments fall into two broad categories – things developers can fix by using good engineering to modify existing rendering pipelines, and things developers can't fix without changing the game or its content in some way. The latter is a really fun area to explore with new game concepts and ideas, and there are already some fascinating experiments, but it is often beyond the scope of developers porting an existing game to VR.
Things developers can fix
Even if you think the added "intensity" of some effect is exciting and immersive, many of your players will simply feel sick and want to stop playing. The intensity felt varies hugely between players: what is mildly interesting to some is literally unbearable to others, and they will stop playing your game because of it. Remember how old FPSes used to ship with "head bob" turned on by default because it was considered more immersive? Developers stopped doing that because it made a certain fraction of players ill, even on a 2D monitor. Finally, try to obey the rules at all times – including cinematics (first or third person), menus, and loading screens.
Things developers can’t easily fix
What developers can do about it
Some of these components are fundamentally difficult problems to do with human perception and the limits of technology. But most of them are purely technical aspects that we know how to solve, and we also know that when they're not done correctly, they cause some of the strongest forms of simulator sickness. Anecdotally, many of the reports of side-effects are due to these well-understood factors, which is frustrating to us both as a company and as a bunch of geeks who want to see VR succeed. For some components, it's up to Oculus to improve our tools, libraries and documentation. For others we need the help of developers to do the right thing, use the correct mathematical models, and resist the completely understandable temptation to just "get something on screen". We'll go into more detail in the SDK and in future blog posts.
We all need to educate new users on what to do before they jump with both feet into virtual worlds. This is difficult, especially right now when people get their exciting new dev kit and there is a room with twenty people all scrambling to try it on. Oculus can help by making the calibration process simpler and more intuitive, and we now have a standalone calibration utility that generates named user profiles that all VR games can read. This allows users to adjust the rendering to their unique facial shape and size, and means they will only have to do it once and can then play without using console variables or editing config files. Please incorporate the data from these profiles into your game code, help us encourage players to use it rather than just accepting the defaults, and please also encourage them to take things slow at first. VR is very intense and can take some getting used to.
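To make "incorporate the data from these profiles" concrete, here is a minimal sketch of the pattern: read calibrated per-user values at startup and fall back to defaults only when no profile exists. The ProfileData struct, its field names, and the withDefaults helper are all hypothetical illustrations, not the actual Oculus profile API – consult the SDK for the real interface.

```cpp
// Hypothetical user-profile data a VR title might read at startup.
// All names here are illustrative; the real field set and API come
// from the SDK's profile system.
struct ProfileData {
    float ipdMeters;       // inter-pupillary distance
    float eyeHeightMeters; // standing eye height, for camera placement
    bool  valid;           // false if no calibrated profile was found
};

// Use the player's calibrated values when present; fall back to
// conservative defaults only when no profile exists, rather than
// shipping one fixed value for every player.
ProfileData withDefaults(ProfileData p)
{
    if (!p.valid) {
        p.ipdMeters       = 0.064f; // a commonly cited average adult IPD
        p.eyeHeightMeters = 1.675f; // illustrative default, not an SDK value
        p.valid           = true;
    }
    return p;
}
```

The point of the pattern is the priority order: a calibrated profile always wins over the default, so a player who has run the calibration utility once gets correct rendering in every game that reads the profile.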
In summary – do the math right, don’t cut corners, be kind to your sensitive players, and encourage them to take it slowly at first.
Tom Forsyth, Oculus coder.
 VORTAN is formed from five Rift dev kits, acting as one – dedicated, inseparable, invincible! Alternatively it stands for Vestibulo-Ocular Reflex something something Nausea. If anyone can remember what the T & A stand for (behave!), please let me know.
 Pretty sure I saw Stillness Illness open for Sigue Sigue Sputnik at the Apollo in ’83.
 People with this problem can try setting the IPD to zero, which will then present the same image to both eyes. This removes all vergence cues, which decreases immersion, but if it also reduces eye-strain that’s probably a good trade-off. They will still get many useful 3D cues from parallax and translation via the head-on-a-stick model which they would not have from a standard monitor or a 3D movie.
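To illustrate why zeroing the IPD presents the same image to both eyes while preserving head-motion parallax: each eye's camera is offset half the IPD from the head position, so a zero IPD collapses the two cameras into one, while head movement still translates both. This is a sketch of that geometric idea under an assumed ±IPD/2 offset convention, not the SDK's actual eye-transform API.

```cpp
// Minimal 3D vector for the sketch.
struct Vec3 { float x, y, z; };

// Each eye sits half the inter-pupillary distance (IPD) to the left
// or right of the tracked head position along the head's local X axis.
// (Illustrative convention only; the real SDK supplies eye transforms.)
Vec3 eyePosition(const Vec3& head, float ipdMeters, bool leftEye)
{
    float offset = (leftEye ? -0.5f : 0.5f) * ipdMeters;
    return Vec3{ head.x + offset, head.y, head.z };
}
```

With ipdMeters set to zero, both calls return the identical position, so both eyes render the same image and all vergence cues disappear – but because the head position still moves with the tracker, parallax from head translation survives, which is exactly the trade-off described above.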