The Oculus Lipsync Unity plugin is a tool for syncing avatar lip movements to speech sounds. Oculus Lipsync analyzes an audio input stream, from a microphone or an audio file, and predicts, either offline or in real time, a set of values called visemes, which can be used to animate the lips of an avatar.
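To illustrate the idea, the sketch below shows how a single frame of viseme weights might be mapped onto avatar blendshape values. This is a conceptual example in Python, not the actual Oculus Lipsync API (which is exposed to Unity scripts in C#); the viseme names follow the common 15-viseme set used by Oculus Lipsync, but the function and clamping behavior are illustrative assumptions.

```python
# Conceptual sketch (not the real Oculus Lipsync API): a viseme
# prediction is one weight per viseme, which can drive an avatar's
# blendshapes.

# Viseme names following the 15-viseme set used by Oculus Lipsync.
VISEMES = ["sil", "PP", "FF", "TH", "DD", "kk", "CH", "SS",
           "nn", "RR", "aa", "E", "ih", "oh", "ou"]

def visemes_to_blendshapes(weights, gain=1.0):
    """Map a frame of viseme weights to named blendshape values,
    clamped to [0, 1] as blendshape drivers typically expect."""
    if len(weights) != len(VISEMES):
        raise ValueError("expected one weight per viseme")
    return {name: min(max(w * gain, 0.0), 1.0)
            for name, w in zip(VISEMES, weights)}

# Example frame: mostly silence with a strong "aa" mouth opening.
frame = [0.1] + [0.0] * 9 + [0.8] + [0.0] * 4
shapes = visemes_to_blendshapes(frame)
print(shapes["aa"])  # 0.8
```

In the real plugin, a script attached to the avatar reads each predicted viseme frame per update and applies the weights to the corresponding blendshapes on the face mesh.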
Oculus Lipsync offers a Unity plugin for use on Windows or macOS.
The following sections describe the requirements, download and setup for development with the Oculus Lipsync plugin for Unity.
The Oculus Lipsync Unity integration requires Unity 5.x (Professional or Personal) or later, targeting Android or Windows platforms and running on Windows 7, 8, or 10. OS X 10.9.5 and later is also currently supported. See Unity Compatibility and Requirements for details on our recommended versions.
To download the Oculus Lipsync Unity integration and import it into a Unity project, complete the following steps.