Using Lip Sync Integration

To use the Lip Sync integration, a scene must include the LipSyncInterface, the main interface to the OVRLipSync DLL. A prefab is included in the integration for convenience.

OVRLipSyncContext must be added to each GameObject which has the morph or texture target that you want to control.
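If you prefer to wire this up in code rather than in the Editor, a minimal sketch might look like the following (this assumes the OVRLipSync scripts from the integration are already in your project; the class name LipSyncSetupExample is only illustrative):

    using UnityEngine;

    // Illustrative setup script: attach to the avatar GameObject that owns the
    // morph or texture target you want to control.
    public class LipSyncSetupExample : MonoBehaviour
    {
        void Start()
        {
            // The context needs an AudioSource on the same GameObject so that
            // Unity's audio pipeline can feed it samples.
            if (GetComponent<AudioSource>() == null)
                gameObject.AddComponent<AudioSource>();

            if (GetComponent<OVRLipSyncContext>() == null)
                gameObject.AddComponent<OVRLipSyncContext>();
        }
    }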

OVRLipSyncContextMorphTarget and OVRLipSyncContextTextureFlip are the scripts that bridge the viseme output from OVRLipSyncContext to the morph and texture targets.

OVRLipSyncContextMorphTarget requires a Skinned Mesh Renderer with blend targets assigned to it (see the Prefab LipSyncTarget_Female for an example). The mesh should include all 15 visemes generated by OVRLipSyncContext - expand BlendShapes in the head_girl Inspector view to see them:

Each blend target from sil to ou represents a viseme generated by the viseme engine. You may view each one by setting the blend target for a single viseme to 100.0. Note that sil corresponds to silence, i.e., the neutral expression; setting it to 100 with all other values at 0 will have no visible effect.
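For example, a small preview helper using only standard Unity APIs could force a single viseme on and all others off (the field names and the index value here are illustrative; the actual blend shape indices depend on your mesh):

    using UnityEngine;

    // Illustrative preview helper: shows one viseme at full strength.
    public class VisemePreview : MonoBehaviour
    {
        public SkinnedMeshRenderer headRenderer;  // renderer holding the viseme blend shapes
        public int visemeBlendShapeIndex = 1;     // index of the viseme to preview on your mesh

        void Start()
        {
            // Zero every blend shape first, then push the chosen viseme to 100.
            int count = headRenderer.sharedMesh.blendShapeCount;
            for (int i = 0; i < count; i++)
                headRenderer.SetBlendShapeWeight(i, 0f);

            // 100 = fully applied; previewing sil this way looks identical to the neutral pose.
            headRenderer.SetBlendShapeWeight(visemeBlendShapeIndex, 100f);
        }
    }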

You may use more than 15 blend shapes in a target, and in fact we recommend doing so, in order to add facial expressions and blinking to the avatar. Notice that not every blend shape is a viseme - for example, blinkR and blinkL control the eyes.

Select LipSyncMorphTarget_Female under Prefabs and, in the Inspector, find the attached script OVR Lip Sync Context Morph Target and expand it to see a map of the viseme outputs to the blend shapes:

Notice that Element 0 (which represents the sil viseme) has an index of two - this index specifies which blend target in the model is driven by that viseme's blend value.
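Conceptually, the mapping works roughly like the sketch below (a simplified illustration, not the shipped OVRLipSyncContextMorphTarget source; ApplyVisemes and its float array are hypothetical stand-ins for the per-frame viseme values coming from the context):

    using UnityEngine;

    // Simplified sketch of the viseme-to-blend-shape mapping idea.
    public class MorphTargetMappingSketch : MonoBehaviour
    {
        public SkinnedMeshRenderer skinnedMeshRenderer;
        public int[] visemeToBlendTargets = new int[15];  // blend shape index per viseme; element 0 is sil

        // 'visemes' stands in for the per-frame viseme weights (0..1) from the lip sync context.
        public void ApplyVisemes(float[] visemes)
        {
            for (int i = 0; i < visemeToBlendTargets.Length && i < visemes.Length; i++)
            {
                // Unity blend shape weights are expressed on a 0..100 scale.
                skinnedMeshRenderer.SetBlendShapeWeight(visemeToBlendTargets[i], visemes[i] * 100f);
            }
        }
    }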

Now select LipSynchMorphTarget_RobotTextures in Prefabs to view OVR Lip Sync Context Texture Flip in the Unity Inspector. Expand Textures:

OVR Lip Sync Context Texture Flip requires the Material whose texture you wish to control, along with a set of textures. Each element in the Textures field must be the texture you want to associate with the corresponding viseme:

The TextureFlip script chooses only one texture to use on a given frame and assigns it to the material's main texture; that material should be assigned to the model that uses the texture to draw the avatar's lips.
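In outline, that per-frame choice can be sketched as follows (again a simplified illustration rather than the shipped OVRLipSyncContextTextureFlip source; ApplyVisemes is a hypothetical stand-in for the per-frame viseme input):

    using UnityEngine;

    // Simplified sketch of the texture-flip idea: pick the strongest viseme
    // this frame and swap the material's main texture to its mapped texture.
    public class TextureFlipSketch : MonoBehaviour
    {
        public Material material;                     // material on the mesh that draws the mouth
        public Texture[] textures = new Texture[15];  // one texture per viseme

        public void ApplyVisemes(float[] visemes)
        {
            int strongest = 0;
            for (int i = 1; i < visemes.Length && i < textures.Length; i++)
            {
                if (visemes[i] > visemes[strongest])
                    strongest = i;
            }

            if (textures[strongest] != null)
                material.mainTexture = textures[strongest];
        }
    }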

This type of avatar is somewhat cartoon-like and is a good fit with scenes that include a large number of avatars, such as social scenes.

Other OVRLipSync Scripts

OVRLipSyncMicInput is for use with a GameObject which has an AudioSource attached to it. It takes input from any attached microphone and pipes it through the AudioSource.
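The basic pattern, using only standard Unity APIs, is roughly the following (a minimal sketch, not the shipped OVRLipSyncMicInput source; the buffer length and sample rate are illustrative):

    using UnityEngine;

    // Minimal sketch: record the default microphone into a looping clip and play
    // it through the attached AudioSource so downstream components (such as
    // OVRLipSyncContext) receive the samples.
    [RequireComponent(typeof(AudioSource))]
    public class MicInputSketch : MonoBehaviour
    {
        void Start()
        {
            var source = GetComponent<AudioSource>();
            // null device name = default microphone; 1-second looping buffer at 16 kHz.
            source.clip = Microphone.Start(null, true, 1, 16000);
            source.loop = true;

            // Wait until the microphone starts delivering samples before playing.
            while (Microphone.GetPosition(null) <= 0) { }
            source.Play();
        }
    }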

Note that an AudioSource must be available to use the OVRLipSyncContext script, as the system relies on the function OnAudioFilterRead to analyze the audio and return the viseme buffers which drive the morph or texture targets.
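To illustrate the mechanism (not the actual OVRLipSyncContext source): Unity calls OnAudioFilterRead on any script that sits next to an AudioSource, handing it each audio buffer on the audio thread, and that is where the samples can be passed to the lip sync engine. ProcessFrame below is a hypothetical placeholder for that analysis call:

    using UnityEngine;

    [RequireComponent(typeof(AudioSource))]
    public class AudioFilterSketch : MonoBehaviour
    {
        // Unity invokes this with the interleaved audio buffer for each DSP block.
        void OnAudioFilterRead(float[] data, int channels)
        {
            ProcessFrame(data, channels);
        }

        // Hypothetical placeholder for handing the buffer to the viseme analysis.
        void ProcessFrame(float[] data, int channels)
        {
        }
    }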

We recommend looking at the other scripts included with this integration; they provide more insight into what is possible with OVRLipSync. For example, we include some helper scripts to facilitate easy on-screen (in VR) debugging.