Meta XR Audio SDK Overview
Audio is crucial for creating a persuasive VR or MR experience. Because audio cues play a key role in our sense of being present in an actual, physical space, any effort a development team devotes to getting audio right pays off in the user’s sense of immersion.
Meta provides free, easy-to-use spatializer plugins for engines and middleware including Unity, Unreal, FMOD, and Wwise. Our spatialization features support PCVR as well as Quest, Quest 2, Quest Pro, Quest 3, and Quest 3S development.
By the end of this document, you’ll be able to:
- Implement the Meta XR Audio SDK in your project so the user can localize audio sources in three-dimensional space.
- Define spatial audio and what developers can do with it.
- Define default Meta XR Audio SDK behavior.
- Explain how and why to use Meta XR Audio SDK.
Note: If you are just getting started with this Meta XR feature, we recommend using Building Blocks, a Unity extension for Meta XR SDKs, to quickly add features to your project.
The Meta XR Audio SDK provides spatial audio functionality including head-related transfer function (HRTF) based object and ambisonic spatialization, and room acoustics simulation. It is a replacement for the Oculus Spatializer plugin.
The ability to localize audio sources in three-dimensional space is a fundamental part of how we experience sound. Spatialization is the process of modifying sounds to make them localizable, so they seem to originate from distinct locations relative to the listener. It is a key part of creating a sense of presence in virtual reality games and applications, and the Meta XR Audio SDK simplifies the process of achieving it.
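To make the idea of spatialization concrete, here is a minimal, illustrative sketch that pans a mono signal to stereo using a simple interaural time difference (ITD) and interaural level difference (ILD), plus inverse-distance attenuation. This is only a conceptual toy, not the HRTF-based processing the Meta XR Audio SDK performs; all function names and constants below are assumptions chosen for illustration.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, at roughly room temperature
HEAD_RADIUS = 0.0875    # m, rough average human head radius
SAMPLE_RATE = 48000     # Hz

def spatialize(mono, source_xz):
    """Toy spatializer: pan a mono signal to stereo with a simple
    interaural time difference (ITD) and level difference (ILD).
    The listener sits at the origin facing +z; source_xz is (x, z) in meters.
    Illustrative only -- real spatializers convolve with HRTFs instead.
    """
    x, z = source_xz
    distance = max(math.hypot(x, z), 1e-6)
    azimuth = math.atan2(x, z)  # 0 = straight ahead, +pi/2 = hard right

    # Woodworth-style ITD approximation, converted to whole samples.
    itd_seconds = (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth + math.sin(azimuth))
    delay = int(round(abs(itd_seconds) * SAMPLE_RATE))

    # Crude ILD: attenuate the far ear as the source moves off-center,
    # and apply inverse-distance attenuation to both ears.
    atten = 1.0 / distance
    near_gain = atten
    far_gain = atten * (1.0 - 0.7 * abs(math.sin(azimuth)))

    near = [s * near_gain for s in mono] + [0.0] * delay
    far = [0.0] * delay + [s * far_gain for s in mono]

    # Source on the right: the left ear is the far (delayed, quieter) ear.
    return (far, near) if azimuth >= 0 else (near, far)
```

For a source one meter to the right and one meter ahead, the right channel is louder and arrives a few samples earlier than the left, which is roughly the cue pattern a real spatializer reproduces with far greater fidelity.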
The steps required to implement the Meta XR Audio SDK will vary slightly depending on which audio engine you are using. Follow the links below to the steps for your project:
Sound design and mixing is an art form, and VR is a new medium in which it is expressed. Whether you’re an aspiring sound designer or a veteran, VR provides many new challenges and inverts some of the common beliefs we’ve come to rely upon when creating music and sound cues for games and traditional media.
Read the Introduction to VR Audio white paper for key ideas and how to address them in VR, or see any of the additional topics:
| Topic | Description |
|---|---|
| | Describes how humans localize sound. |
| | Describes spatialization and head-related transfer functions. |
| | Describes different listening devices and their advantages and disadvantages. |
| | Describes environmental modeling, including reverberation and reflections. |
| | Explains how sound design must change to support spatialization, given how humans localize sound and how we can convince listeners that a sound comes from a particular point in space. |
| | Explains that mixing a scene for VR, like sound design, is an art as well as a science, and offers recommendations along with their caveats. |
| | Defines technical VR audio terms. |