Sync Face Expressions with multiple models

Hi there! I created my main character model using two meshes for his face: one is the regular face, and the second is a moustache. I'm using the speech-text lip sync method.

Both of them have expressions and phonemes properly set, yet when I use the tokens in Dialogue: Play speech, only the face mesh performs the expression and lip sync - the moustache mesh does not. Both meshes have the same number of blendshapes, properly set, and they work when I use an Object: Shapeable node to set the expression, but not when I use Dialogue: Play speech, which would be the proper way to handle the lip sync.


Is there a way to sync both meshes?

Comments

  • Welcome to the community, @fdslk.

    The expression system indeed assumes that there's only one Shapeable component in the character's Hierarchy, and generally it's best to rely on one character mesh if you can.

    However, if the Blendshapes in your face and moustache models match up, then a simple script should be able to transfer the values from the face to the moustache via an Update() call.

    Something like this should do it:
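    A minimal sketch of such a script (the class and field names are illustrative, not part of AC's API - it assumes both SkinnedMeshRenderers define their blendshapes in the same order, and copies the face's weights onto the moustache in LateUpdate so it runs after the lip sync has set them):

    ```csharp
    using UnityEngine;

    // Copies blendshape weights from the face mesh to the moustache mesh
    // each frame, so only the face's Shapeable needs to be animated.
    public class CopyBlendshapes : MonoBehaviour
    {
        // Assign both renderers in the Inspector
        public SkinnedMeshRenderer faceRenderer;
        public SkinnedMeshRenderer moustacheRenderer;

        private void LateUpdate ()
        {
            // Assumes matching blendshape order on both meshes;
            // Min guards against a count mismatch
            int count = Mathf.Min (faceRenderer.sharedMesh.blendShapeCount,
                                   moustacheRenderer.sharedMesh.blendShapeCount);

            for (int i = 0; i < count; i++)
            {
                moustacheRenderer.SetBlendShapeWeight (i, faceRenderer.GetBlendShapeWeight (i));
            }
        }
    }
    ```

    Attach it to the character and assign the two renderers in the Inspector.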

    Bear in mind that this would override control over the moustache model's Shapeable component, so you should remove that one and just rely on the face.
  • Thanks Chris! I wasn't sure if that was part of the system or not - I didn't expect the extra script, and it works like a charm. Both sets of blendshapes are now in sync using the Dialogue: Play speech node!

    Even so, I would advise you to include this script as part of Adventure Creator, since a lot of 3D modellers prefer to keep facial details as separate models for better handling of their blendshapes and textures.
Welcome to the official forum for Adventure Creator.