Oculus Lipsync for Unity

The Oculus Lipsync Unity plugin is a tool for syncing avatar lip movements to speech sounds. Oculus Lipsync analyzes an audio stream from a microphone or an audio file and predicts, either offline or in real time, a set of values called visemes, which can be used to animate the lips of an avatar.

Oculus Lipsync offers a Unity plugin for use on Windows and macOS.

The following sections describe the requirements, download, and setup for development with the Oculus Lipsync plugin for Unity.

Requirements

The Oculus Lipsync Unity integration requires Unity 5.x (Professional or Personal) or later, targeting Android or Windows platforms, running on Windows 7, 8, or 10. macOS (OS X 10.9.5 or later) is also supported. See Unity Compatibility and Requirements for details on our recommended versions.

Download and Import

To download the Oculus Lipsync Unity integration and import it into a Unity project, complete the following steps.

  • Download the Oculus Lipsync Unity package from the Oculus Lipsync Unity page.
  • Extract the zip archive.
  • Open your project in the Unity Editor, or create a new project.
  • In the Unity Editor, select Assets > Import Package > Custom Package.
  • Select the OVRLipSync.unitypackage file in the LipSync/UnityPlugin subfolder of the archive you extracted in the first step. When the Importing Package dialog opens, leave all assets selected and click Import.
Note: We recommend removing any previously-imported versions of the Oculus Lipsync Unity integration before importing a new version.
Important: If you wish to use both the OVRVoiceMod and OVRLipSync plugins, install the Unity unified package instead.
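The extract-and-locate steps above can be sketched in a shell session. The folder layout below only simulates the structure the article describes, so that the final command can be run as shown; the exact archive contents and file names vary by release.

```shell
# Simulate the layout of the extracted Lipsync archive (illustrative only;
# in a real workflow this directory comes from extracting the downloaded zip).
mkdir -p LipSync/UnityPlugin
touch LipSync/UnityPlugin/OVRLipSync.unitypackage

# Locate the Unity custom package to import via
# Assets > Import Package > Custom Package in the Unity Editor.
find LipSync -name '*.unitypackage'
```

Once the package path is known, importing it is done entirely inside the Unity Editor; there is no command-line import step in this workflow.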