Hi all.
I've searched through the forum, but cannot find anything that relates to integrating my own lip-sync solution into AC.
Background: I'm using Spine, and I have set up my character's mouth positions as attachments in a mouth slot. I'm going to use Papagayo to create the lip-sync files.
The idea I had was to have a script that listens for a call from AC saying, in effect, "character X is now speaking this particular line of dialogue". It would be great if such a hook existed. That way, my Spine lip-sync integration script could tell the character which mouth attachment to display at each moment in time (based on the Papagayo dialogue file).
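For reference, Papagayo's MOHO switch export (`.dat`) is just a plain-text header followed by frame/phoneme pairs, so the timing data is straightforward to read in. Here is a minimal parser sketch, written in Python purely for illustration (a real AC integration would be a C# MonoBehaviour in Unity); the function name and the default frame rate are my own assumptions:

```python
def parse_papagayo_dat(text, fps=24):
    """Parse a Papagayo/MOHO switch export into (time_in_seconds, phoneme) pairs.

    The export format is a "MohoSwitch1" header line followed by
    "<frame> <phoneme>" lines. fps is whatever frame rate the file
    was exported at (assumed 24 here).
    """
    events = []
    for line in text.splitlines():
        line = line.strip()
        # Skip the header and any blank lines; data lines start with a frame number.
        if not line or not line[0].isdigit():
            continue
        frame, phoneme = line.split()
        events.append((int(frame) / fps, phoneme))
    return events


# Example input in the MOHO switch format:
sample = """MohoSwitch1
1 rest
5 MBP
9 AI
14 rest"""

events = parse_papagayo_dat(sample)
```

Each resulting `(time, phoneme)` pair could then be mapped to the matching mouth attachment in the Spine slot when the character's speech reaches that timestamp.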
Thanks
Comments
As AC characters can speak over each other, the best solution would probably be to create per-character functions that read a character's current line and its line ID (if one was generated by the Speech Manager), e.g.:
GetCurrentSpeechText ()
GetCurrentSpeechID ()
Would those suffice?
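To illustrate why the getters need to be per-character: each character keeps its own "current line" state, so two characters speaking at once don't clobber each other. A conceptual sketch in Python (the actual API would be C# methods on AC's character class; the class and method names here are hypothetical):

```python
class SpeechTracker:
    """Tracks each character's current spoken line independently,
    mirroring the proposed per-character GetCurrentSpeechText/ID getters."""

    def __init__(self):
        # character name -> (line_id, text); absent key means "not speaking"
        self._current = {}

    def start_speech(self, character, line_id, text):
        self._current[character] = (line_id, text)

    def stop_speech(self, character):
        self._current.pop(character, None)

    def get_current_speech_text(self, character):
        entry = self._current.get(character)
        return entry[1] if entry else None

    def get_current_speech_id(self, character):
        entry = self._current.get(character)
        return entry[0] if entry else -1
```

Because the state is keyed by character, overlapping speech works: querying one character never returns another character's line.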
If the broadcast were made on the character's GameObject, you could presumably omit the Character as an argument. Technically the ID number alone would be enough, since you can use the Speech Manager to extract the text itself. So would this be more like it?
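The ID-only broadcast idea can be sketched as a simple publish/subscribe pattern: the broadcast carries just the line ID, and each listener resolves the text itself from a speech-manager-style lookup. Again this is an illustrative Python sketch with hypothetical names, not AC's actual API:

```python
# Hypothetical speech-manager table: line ID -> line text.
speech_lines = {10: "Hello.", 11: "Goodbye."}

# Listeners subscribed to the "speech started" broadcast.
listeners = []


def on_speech_start(callback):
    """Subscribe a callback that receives only the line ID."""
    listeners.append(callback)


def broadcast_speech(line_id):
    """Broadcast that a line has started; listeners get just the ID."""
    for callback in listeners:
        callback(line_id)


# A lip-sync listener resolves the text from the ID on its own:
heard = []
on_speech_start(lambda line_id: heard.append(speech_lines[line_id]))
broadcast_speech(10)
```

The design advantage is a minimal broadcast payload: the lip-sync script only needs the ID to look up both the text and its matching Papagayo file.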
AC already makes use of delegates, actually, allowing you to override the input system. I'm keen for AC to adopt a more "generic" approach to integrations rather than enforcing specific assets, so extending this would be OK in principle. It does raise the question of where to draw the line, however.
As this is a somewhat larger issue than it first looked, I can't say whether anything will be included in v1.51, but I shall certainly have a look.