Oculus Lipsync Unreal

Published 2020-09-02

Oculus Lipsync Unreal Integration v20

Oculus Lipsync is an add-on plugin and set of scripts that syncs the mouth movements of a game character or other digital asset to speech, in real time, from either pre-recorded audio or live microphone input.


For documentation and more information, see the Oculus Lipsync Unreal Integration Guide.

New/Updated Features

  • Fixed the sample project so it runs on UE 4.25.
  • Added performance enhancements.
  • General bug fixes and improvements.

Known Issues

  • Avatar lips don't move in response to the user's speech in a launched Android app that uses live capture.
    As a workaround, use adb shell to grant the Lipsync package permission to record audio. Example:
    adb shell pm grant [lipsync.package.name] android.permission.RECORD_AUDIO
    This forces the RECORD_AUDIO permission, allowing the app to use the microphone so the avatar's lips move with the user's voice.
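The workaround above can be scripted for a connected device. This is a minimal sketch; the package name `com.example.lipsyncapp` is a hypothetical placeholder, so substitute your app's actual package name:

```shell
# Hypothetical package name for illustration; replace with your app's package.
PKG=com.example.lipsyncapp

# Grant microphone access to the already-installed app without reinstalling it.
adb shell pm grant "$PKG" android.permission.RECORD_AUDIO

# Optionally verify that the permission was granted.
adb shell dumpsys package "$PKG" | grep RECORD_AUDIO
```

Note that `pm grant` only works for runtime (dangerous) permissions that the app already declares in its manifest, so the app must list android.permission.RECORD_AUDIO in AndroidManifest.xml.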