The Oculus Lipsync Unreal plugin is a tool used to sync avatar lip movements to speech sounds. Oculus Lipsync analyzes an audio input stream from a microphone or an audio file and predicts, either offline or in real time, a set of values called visemes, which may be used to animate the lips of an avatar.
Oculus Lipsync offers an Unreal plugin for use on Windows or macOS.

The following sections describe the requirements for Unreal, and how to download and set up the Oculus Lipsync plugin for Unreal Engine.
The Oculus Lipsync Unreal plugin is compatible with Unreal Engine 4.20 or later, targeting Android, Windows and macOS platforms. See the Unreal Game Engine guide for more details on the recommended versions.
To start using Oculus Lipsync in your Unreal project:
1. Copy the plugin folder from [download-dir]\LipSync\UnrealPlugin\OVRLipSyncDemo\Plugins to [Install-Directory]\Epic Games\UE_x.xx\Engine\Plugins. For example, for Unreal version 4.20 on Windows, you would find this folder at the following location: C:\Program Files\Epic Games\UE_4.20\Engine\Plugins.
2. Create a new project or open an existing project in Unreal Engine.
3. From the Edit menu, select Plugins and then Audio. You should see the Oculus Lipsync plugin as one of the options. Select Enabled to enable the plugin for your project.
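As an alternative to enabling the plugin through the editor UI, a project can declare the plugin in its .uproject descriptor so it is enabled whenever the project loads. The fragment below is a minimal sketch; the plugin name "OVRLipSync" and the engine association value are assumptions and should be checked against the name in the plugin's .uplugin file and your installed engine version:

```json
{
  "FileVersion": 3,
  "EngineAssociation": "4.20",
  "Plugins": [
    {
      "Name": "OVRLipSync",
      "Enabled": true
    }
  ]
}
```

Unreal reads the Plugins array when the project is opened and enables each listed plugin, which is useful for keeping plugin configuration under version control rather than relying on per-machine editor settings.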
The following image shows an example.
