Oculus Lipsync Unreal Integration 1.43.0
Oculus Lipsync is an add-on plugin and set of scripts that syncs the mouth movements of a game character or other digital asset to speech, in real time, from pre-recorded audio or live microphone input.
For documentation and more information, see the Oculus Lipsync Unreal Integration Guide.
- General bug fixes and improvements.
- Added May 26, 2020: The sample code in this version of Lipsync is not compatible with Unreal Engine 4.25.
- In a launched Android app using live capture, avatar lips do not move in response to the user's speech.
As a workaround, you can use adb shell to grant the Lipsync package the audio-recording permission. Example:
adb shell pm grant [lipsync.package.name] android.permission.RECORD_AUDIO
This grants the app microphone access, and the avatar's lips will then move with the user's voice.
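If lips still do not move after granting the permission, you can confirm that the grant took effect. The sketch below uses the same package-name placeholder as the command above and relies on the standard Android dumpsys output format:

```shell
# Confirm the RECORD_AUDIO permission state for the Lipsync app.
# A line containing "android.permission.RECORD_AUDIO: granted=true"
# indicates the workaround took effect.
adb shell dumpsys package [lipsync.package.name] | grep RECORD_AUDIO
```

Note that on Android 6.0 and later the grant may be reset if the app is reinstalled, in which case the pm grant command must be run again.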