Oculus Go Development

On 6/23/20 Oculus announced plans to sunset Oculus Go. Information about dates and alternatives can be found in the Oculus Go introduction.

Oculus Quest Development

All Oculus Quest developers must pass the concept review before gaining publishing access to the Quest Store and additional resources. Submit a concept document for review as early in your Quest application development cycle as possible. For additional information and context, please see Submitting Your App to the Oculus Quest Store.

Set Up Oculus Lipsync for Native Development

Oculus Lipsync analyzes the audio input stream from microphone input or an audio file and predicts a set of values called visemes, which are gestures or expressions of the lips and face that correspond to a particular speech sound. The term viseme is used when discussing lip reading and is a basic visual unit of intelligibility. In computer animation, visemes may be used to animate avatars so that they look like they are speaking.

Oculus Lipsync uses a repertoire of visemes to modify avatars based on a specified audio input stream. Each viseme targets a specified geometry morph target in an avatar to influence the amount that target will be expressed on the model. Thus, with Oculus Lipsync we can generate realistic lip movement in sync with what is being spoken or heard. This enhances the visual cues that one can use when populating an application with avatars, whether the character is controlled by the user or is a non-playable character (NPC).

The Oculus Lipsync system maps to 15 separate viseme targets: sil, PP, FF, TH, DD, kk, CH, SS, nn, RR, aa, E, ih, oh, and ou. The visemes describe the facial expression produced when uttering the corresponding speech sound. For example, the viseme sil corresponds to a silent/neutral expression, PP to pronouncing the first syllable of “popcorn”, and FF to the first syllable of “fish”. See the Viseme Reference Images for images that represent each viseme.

These 15 visemes have been selected to give the maximum range of lip movement, and are agnostic to language. For more information, see the Viseme MPEG-4 Standard.

Oculus Lipsync offers a library for native C++ development on Windows or macOS.

Animated Lipsync Example

The following animated image shows an avatar animated with Oculus Lipsync saying “Welcome to the Oculus Lipsync demo.”

Laughter Detection

Oculus Lipsync version 1.30.0 and newer supports laughter detection, which can help add more character and emotion to your avatars.

The following animation shows an example of laughter detection.

The following sections describe the requirements, download, and setup steps for native development with the Oculus Lipsync libraries on Windows or macOS.


To use the Lipsync native library, you must have a C/C++ compiler installed on your development computer. For example, you could use one of the following development environments:

  • Microsoft Visual Studio C++ 2015 or later
  • Xcode 8 or later
  • Android NDK 12b or later



To set up the Lipsync package for Visual Studio or Xcode, choose one of the following:

  • In Visual Studio, access your Project Properties. In the dialog:
    • Under C/C++ > General, for Additional Include Directories, choose Edit and provide the path to the Lipsync\Include directory (under the extraction directory from the previous step). This will be something like [extraction-dir]\LipSync\Native\Include. The following image shows an example:

    • Under Linker > General, for Additional Library Directories, choose Edit and add the path to the lib folder that contains OVRLipSyncShim.lib for your target platform. For example, for 64-bit Windows development, add [extraction-dir]\LipSync\Native\Lib\Win64.

  • In Xcode:
    • Add libOVRLipSyncShim.a to Link Binary with Libraries, and make it Required. The following image shows an example.

    • For Search Paths, add the ../OVRLipSync/Native/package/Include folder to Header Search Paths and the ../OVRLipSync/Native/package/Lib/MacOs/ folder to Library Search Paths. The following image shows an example.
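If you prefer building from the command line rather than through the IDE, the Xcode settings above translate to compiler include and linker search paths. This is a sketch assuming clang++ on macOS, a source file named main.cpp, and the extraction layout described above:

```shell
# Sketch of a command-line build on macOS mirroring the Xcode settings
# above. main.cpp is a placeholder for your own source file; adjust the
# relative paths to match where you extracted the package.
clang++ main.cpp \
    -I ../OVRLipSync/Native/package/Include \
    -L ../OVRLipSync/Native/package/Lib/MacOs \
    -lOVRLipSyncShim \
    -o lipsync_app
```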

Topic Guide

  • Using Oculus Lipsync: Using The Oculus Lipsync Package
  • Lipsync sample: Exploring Oculus Lipsync with the Unreal Sample
  • Guide to the APIs for native Lipsync development: Lipsync API Reference
  • Viseme reference images: Viseme Reference