Sounds are an elemental part of human experience – they can surprise, exhilarate, frighten, attract, and inspire. From the thoom of thunder to the scrape of dry leaves blowing across the pavement, sounds affect our thoughts and feelings, seamlessly becoming our daydreams and memories, and awakening within us a deep sense of presence.
Good sound design has long been a key component of film and video game production, and it is at least as important for virtual reality applications. We are committed to providing tools to help developers move forward into this exciting frontier and to address its new challenges.
Earlier this year, we released our Audio SDK, providing Oculus Spatializer Plugins (OSPs) for FMOD, Wwise, and Unity. These plugins help sound designers apply high-quality 3D spatialization properties to their audio content in a format that is ready for use in VR games and applications. However, most actual sound design and authoring takes place in a Digital Audio Workstation (DAW), before the content reaches the audio middleware or game engine.
The Oculus Spatializer Plugin for DAWs is a new addition to the Audio SDK that bridges this gap in sound design workflow. It allows sound designers to preview 3D spatialized sounds during the design phase, before their content reaches the audio middleware or game engine level, thus tightening the iteration loop between design and production. This is analogous to a 3D game artist working in Maya using the same shaders that will be used in the final game.
The OSP for DAWs is compatible with most major DAWs, shipping in both the VST format (Nuendo, Cubase, Ableton Live, Reaper, and others) and the AAX format (Avid Pro Tools).
Let’s say a sound designer is tasked with creating a scary zombie noise for a VR horror game that should sound like it comes from behind the player – that is, it will be used as a positional (or spatialized) sound effect. In a typical workflow, the designer begins with a DAW such as Pro Tools, Ableton Live, or Cubase, where the sound effect is created from scratch using a combination of software instruments, samples, and DSP effects. Once the sound is finalized in the DAW, it is exported as a raw audio file that can be imported into an audio middleware or game engine editor. Normally, this is the earliest point at which the OSP can be applied to the sound effect, finally giving the developer a chance to hear the 3D spatialized sound.
The Oculus Spatializer DAW plugin gives the sound designer a chance to apply 3D spatialization to the sound as soon as the original sound is authored.
This can be very useful during the sound design process, as it allows the designer to hear the sound in 3D space before it is played in the virtual game world and to get a better feel for how the sound will sit in the overall mix.
Using the Oculus Spatializer Plugin for DAWs
Once installed, the OSP may be applied as an insert effect on a mono track in your DAW (note that the sound must be mono, or single-channel, to allow for correct spatialization). Once placed, the plugin UI provides access to several parameters, including gain, near/far distances, 3D position, reverb, and reflection. The 2D grid (with views for both the X/Z and X/Y planes) provides a visualization of the current 3D position and near/far settings, and allows for zooming in and out with the Scale (m) knob.
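To build intuition for how the gain and near/far distance parameters interact, here is a minimal sketch of a common distance-attenuation model: full gain inside the near distance, inverse-distance rolloff beyond it, and clamping at the far distance. The function name and the exact rolloff curve are our assumptions for illustration – the OSP’s actual curve may differ.

```python
import math

def spatializer_gain(distance, near=1.0, far=100.0, gain_db=0.0):
    """Illustrative inverse-distance attenuation between near/far clamps.

    Hypothetical model for intuition only; not the OSP's documented curve.
    """
    d = max(near, min(distance, far))  # clamp distance to [near, far]
    attenuation = near / d             # full gain at or inside near distance
    return attenuation * 10 ** (gain_db / 20.0)  # apply overall gain in dB
```

Under this model, a source closer than the near distance plays at full gain, and moving past the far distance causes no further attenuation.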
All parameters can be controlled manually, by an external MIDI controller, or by automation. Parameter automation is a great way to set up scenarios of sounds moving through 3D space and/or in and out of different reverb environments.
Note: Some DAWs, such as Ableton Live, do not distinguish between stereo and mono audio tracks, instead allowing you to place stereo OR mono audio clips within the same audio channel. In such cases, the OSP should only be used on a particular audio track when playing a mono audio clip. The OSP may also be placed on MIDI channels, which can also operate as either stereo or mono. Similarly, the OSP should only be used on MIDI channels which employ a mono sound source.
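If your source material is stereo, most DAWs can convert it for you, but the underlying operation is simple enough to sketch. The helper below (a name we made up for illustration) averages the left and right channels into the single mono channel the OSP expects:

```python
def downmix_to_mono(left, right):
    """Average per-sample L/R pairs into one channel.

    The OSP expects a mono source, so stereo clips should be
    downmixed (or re-recorded as mono) before spatialization.
    """
    return [(l + r) / 2.0 for l, r in zip(left, right)]
```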
Example with Ableton Live
We created an example project in Ableton Live to show how parameter automation can be used to “animate” a sound’s position in 3D (video below). A simple looping sound was set up using Ableton’s built-in Drum Rack on a new MIDI channel. The Drum Rack is basically an array of MIDI note trigger pads. A single sample can be assigned to multiple pads, or each pad can be assigned a unique sample. Each pad can have its own set of effects plugins.
This is a useful tool for quickly previewing sounds using different OSP parameter presets. For example, you might drop the same sample and an instance of the OSP onto pads C0 and D0, and then set C0 to a position in front of the listener, and D0 behind. By triggering the sounds from a MIDI controller keyboard in sequence, you would have a side-by-side comparison of the results of the two positions.
For this demo, we kept it simple and only used a single pad with a mono clip and the OSP. The sound properties are set up such that the sound is sustained (that is, continues playing) during the parameter automation. We automated the X, Y, and Z positions to create the effect of the sound moving in a circular motion around the listener, while changing elevation. The automation curves for each parameter are shown in the image below.
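The automation described above – one orbit around the listener while the elevation ramps – can be sketched as parametric breakpoints. Everything in this snippet is illustrative: the function name, the listener-at-origin coordinate convention (X/Z as the horizontal plane, Y as elevation), and the chosen radius, duration, and step count are our assumptions, not values exposed by the plugin.

```python
import math

def circular_path(duration=20.0, radius=2.0, steps=8,
                  y_start=-1.0, y_end=1.0):
    """Generate (time, x, y, z) breakpoints for one orbit around the
    listener while elevation ramps from y_start to y_end.

    Illustrative only; export the breakpoints as DAW automation points
    for the plugin's xpos, ypos, and zpos parameters.
    """
    points = []
    for i in range(steps + 1):
        t = duration * i / steps
        angle = 2 * math.pi * i / steps              # one full revolution
        x = radius * math.sin(angle)                 # starts at x = 0
        z = radius * math.cos(angle)                 # starts in front (z = radius)
        y = y_start + (y_end - y_start) * i / steps  # linear elevation ramp
        points.append((t, x, y, z))
    return points
```

With the defaults, the path starts directly in front of the listener and returns there after 20 seconds, matching the kind of motion the automation curves produce.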
This video shows the automation in action – please use headphones:
The sound position begins directly in front of the listener, as shown in the 2D XZ (top-down) view in the plugin UI. The sound position moves in a clockwise motion around the listener, controlled by the top two automation tracks, labelled xpos and zpos. The ypos automation track at the bottom controls the sound’s elevation, as you can clearly see when the 2D grid view is switched from XZ to XY, at around 20 seconds.
OSP availability in the DAW workflow phase helps sound designers author content and make necessary adjustments for creating a great sounding mix in the final VR experience. For installation and usage instructions, to learn more about our other spatializer plugins, or to read about sound design for virtual reality in general, please see our Audio SDK Documentation.