Oculus Lipsync Unreal Integration v20
Oculus Lipsync is an add-on plugin and set of scripts that syncs the mouth movements of a game character or other digital asset to speech, using either pre-recorded audio or live microphone input in real time.
For documentation and more information, see the Oculus Lipsync Unreal Integration Guide.
What's New
- Fixed the sample so that it works on UE 4.25.
- General bug fixes and improvements.
- Added performance enhancements.
Known Issues
- Avatar lips don't move in response to the user's speech in a launched Android app that uses live microphone capture.
As a workaround, you can use adb shell to grant the Lipsync package the audio-recording permission. Example:
adb shell pm grant [lipsync.package.name] android.permission.RECORD_AUDIO
This forces the RECORD_AUDIO permission to be granted so the app can use the microphone, and the avatar's lips will then move with the user's voice.
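The workaround above can be sketched as a small script. The package id `com.your.company.lipsyncsample` is a placeholder, not the actual sample's package name; substitute the id of your own build (you can find it with `adb shell pm list packages`).

```shell
# Placeholder package id -- replace with your app's actual package name.
PKG="com.your.company.lipsyncsample"

if command -v adb >/dev/null 2>&1; then
    # Force-grant the microphone permission on the connected device.
    adb shell pm grant "$PKG" android.permission.RECORD_AUDIO
    # Confirm that the grant took effect.
    adb shell dumpsys package "$PKG" | grep "android.permission.RECORD_AUDIO: granted=true"
else
    echo "adb not found: install Android platform-tools and connect a device"
fi
```

Note that `pm grant` only works for permissions the app already declares in its manifest, so the app must list `android.permission.RECORD_AUDIO` in `AndroidManifest.xml` for the command to succeed.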