Custom Lipsync

Hi all.

I've searched through the forum, but cannot find anything that relates to integrating my own lip-sync solution into AC.

Background: I'm using Spine and I have set my character's mouth positions up as attachments in a mouth slot. I am going to use Papagayo to create the lipsync files.

The idea I had was to have a script that listens for a call from AC saying "character X is now talking and is saying this particular line of dialogue". It would be great if this existed. That way I could make this Spine lip-sync integration script tell the character which mouth attachment to show at a given moment in time (based on the Papagayo dialogue file).

Thanks

Comments

  • I agree.  Are you making use of AC's Speech Manager (e.g. for translations / speech audio), or is that all being handled separately?

    As AC characters can speak over each other, the best solution would probably be to create per-character functions that can be used to read a character's current line, and line ID (if generated by the Speech Manager).  eg:

    GetCurrentSpeechText ()
    GetCurrentSpeechID ()


    Would those suffice?
  • I will be using AC's Speech Manager.

    So are you suggesting that the script, which is assigned to the given character's gameobject, checks on every update whether there is current speech? Sounds good, but I was hoping for something a little more robust.

    Perhaps implementing events in AC would be a great thing, as it could streamline the integration process. For instance, when a character starts talking it broadcasts a "CharacterSpeech" event, and people can add their own listeners to do whatever they want when that event occurs (e.g. affect Spine objects). Events can also pass arguments - in this case, the character and the dialogue ID or similar.
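    Sketched in plain C#, the idea would look something like this (the SpeechEvents class and its members are hypothetical names for illustration, not part of AC's API):

    ```csharp
    using System;

    // Hypothetical static event hub for the proposed "CharacterSpeech" broadcast.
    public static class SpeechEvents
    {
        // Fired when any character begins a line; passes the character's
        // name and the Speech Manager line ID.
        public static event Action<string, int> OnCharacterSpeech;

        public static void Broadcast (string characterName, int lineID)
        {
            if (OnCharacterSpeech != null)
            {
                OnCharacterSpeech (characterName, lineID);
            }
        }
    }
    ```

    A lip-sync script would then just subscribe with `SpeechEvents.OnCharacterSpeech += MyHandler;` and unsubscribe when disabled.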
  • Only making suggestions - I'd rather not make changes to it once released, so I want it to be correct first time.

    If the broadcast was made on the character's gameobject, you could presumably omit the Character as an argument.  Technically the ID number would be enough, as you can use the speech manager to extract the text itself.  So would this be more like it?
    BroadcastMessage ("StartSpeaking", 123);
  • Same here :) Just suggesting at this point.

    So I just had a look at the BroadcastMessage function - http://docs.unity3d.com/ScriptReference/Component.BroadcastMessage.html
    This looks like what you use for the action "Object: Send Message". I've been doing some reading and it seems to be quite an expensive tool. Reference: http://answers.unity3d.com/questions/185980/sendbroadcast-message-vs-delegates-and-events.html

    I think what we're talking about really relates to something much bigger: an architectural decision that could make many different AC features more integration-friendly - not just a character speaking. As such, we could make it more global in nature.

    Chris, what do you think of utilising Events and Delegates?
    Say, have a separate EventManager class that resides in the GameEngine GameObject, and then call particular events when particular AC events occur. i.e. Character speaks. Inventory item is picked up. etc. etc. That way devs can just add their own custom code as an event listener. And you can of course have multiple event listeners, and dynamically add or remove listeners in code.

    1. EventManager class in GameEngine
    - could use this code, which allows for dynamic event names: http://wiki.unity3d.com/index.php?title=Advanced_CSharp_Messenger (note, it currently allows up to 3 parameters, but easy to add more if required)
    2. (example use-case) In Speech.cs, in the Speech method, we can trigger an event in EventManager class: EventManager.Broadcast("CharacterSpeaks_Player", 123)
    3. (example use-case) In a custom script on the player's gameobject, let's say a script called EventListeners.cs residing on the Player gameobject, put in the custom code for the event listener:
    void OnCharacterSpeaking(int DialogueID)
    {
      // get some extra info using the DialogueID
      // make the character's mouth move
    }
    4. (example use-case) On that same script EventListeners.cs, we need to register that event listener above:
    void Start()
    {
      EventManager.AddListener<int>("CharacterSpeaks_Player", OnCharacterSpeaking);
    }
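    For anyone skimming the wiki link, the Advanced CSharp Messenger boils down to a dictionary from event-name strings to delegates. A stripped-down, single-parameter sketch of the same pattern (not the wiki code itself, which adds type checking, scene cleanup and 0-3 parameter overloads):

    ```csharp
    using System;
    using System.Collections.Generic;

    // Simplified string-keyed messenger: listeners register against an
    // event name, and Broadcast invokes whatever is registered.
    public static class SimpleMessenger
    {
        static readonly Dictionary<string, Delegate> eventTable = new Dictionary<string, Delegate> ();

        public static void AddListener<T> (string eventName, Action<T> handler)
        {
            eventTable.TryGetValue (eventName, out Delegate existing);
            eventTable[eventName] = (Action<T>) existing + handler;
        }

        public static void RemoveListener<T> (string eventName, Action<T> handler)
        {
            if (eventTable.TryGetValue (eventName, out Delegate existing))
            {
                eventTable[eventName] = (Action<T>) existing - handler;
            }
        }

        public static void Broadcast<T> (string eventName, T arg)
        {
            if (eventTable.TryGetValue (eventName, out Delegate d) && d is Action<T> action)
            {
                action (arg);
            }
        }
    }
    ```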


    If I have a little play around can I share some of it with you privately to get your thoughts?
  • edited March 2016
    Okay, I've just set something up that is very simple, but works quite well. Let me know what you think.

    1. Add the Messenger.cs and Callback.cs scripts to the project's Static folder, as shown in this link: http://wiki.unity3d.com/index.php?title=Advanced_CSharp_Messenger
    - doesn't need to be added to any gameobject in the scene
    - please make changes based on this link for Unity 5.2 changes: http://spectragate.com/fixing-common-advanced-c-messenger-issues/

    2. In the Speech.cs file, add this code:
    Messenger.Broadcast<Char, int, string> ("CharacterSpeaks", _speaker, lineID, _language);

    3. Developers can add their custom event listeners where they choose. For instance, it could be a generic EventListener.cs script on a separate global gameobject. Or a script on the Player 'CustomPlayerController.cs'. And it would contain:
    void Start () {
      Messenger.AddListener<Char, int, string> ("CharacterSpeaks", OnCharacterSpeaks);
    }


    void OnCharacterSpeaks(Char character, int lineID, string language) {
     // do some awesome stuff here
    }
  • Also, due to the dynamic nature of the system and the fact that you can create an event name on the fly, you could even create an Action called "Object: Trigger Event" and allow the developer to enter in a custom string, which would be the event name, and maybe 1-3 custom parameters.

    Then the particular action would broadcast a message (Messenger.Broadcast), which could be listened to by some custom code. Could be interesting.
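    As a rough illustration, such an "Object: Trigger Event" action might follow AC's usual custom-Action pattern of subclassing AC.Action and overriding Run. The class name and fields below are hypothetical, and the exact base-class members may differ between AC versions:

    ```csharp
    using UnityEngine;
    using AC;

    [System.Serializable]
    public class ActionTriggerEvent : Action
    {
        // Entered by the developer in the Action's UI
        public string eventName;
        public int intParameter;

        public ActionTriggerEvent ()
        {
            this.isDisplayed = true;
            category = ActionCategory.Object;
            title = "Trigger Event";
            description = "Broadcasts a named event through the Messenger.";
        }

        override public float Run ()
        {
            // Fire the developer-named event; any registered listener reacts
            Messenger.Broadcast<int> (eventName, intParameter);
            return 0f;
        }
    }
    ```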
  • Going a bit beyond the simple change I was expecting, but I appreciate the depth of your response.

    AC already makes use of delegates, actually - allowing you to override the input system.  I'm keen for AC to adopt a more "generic" approach to integrations, rather than enforcing specific assets, so extending this would be OK in principle.  It does bring the issue of "where do you draw the line?" however.

    Being a somewhat larger issue than this first looked to be, I can't say if anything shall be included in v1.51, but I shall certainly have a look.
  • Yeah sorry I got a bit carried away haha.

    I get what you mean about "where do you draw the line?". It's always hard when so many different people will want to do so many different things, some not touching the code at all and some hacking away like crazy.

    I suppose that's why a simple, generic approach works so well, as people can use as much or as little as they want.

    I'm happy to explore it further with you if you want. Just let me know.
  • edited March 2016
    Hi Chris.
    So going back to a solution for the custom lip-syncing, without getting carried away with everything else...

    Another idea for custom lip-sync integration would be to add a "Custom" LipSyncMode, letting devs handle lip-sync in a custom animation engine. Then in Speech.cs, after the RogoLipSyncIntegration conditional:
    else if (KickStarter.speechManager.lipSyncMode == LipSyncMode.Custom)
    {
        speaker.StartCustomLipSync (lineID, _language, _message);
    }
    and have the StartCustomLipSync function in Char.cs be:
    public void StartCustomLipSync (int lineID, string lang, string message)
    {
        animEngine.StartLipSync (lineID, lang, message);
    }
    Then devs can do whatever they want in their own custom animation engine.
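    Under that scheme, a Spine integration would live in a custom animation engine along these lines. AnimEngine is AC's real base class for animation engines, but StartLipSync is the method proposed above, not an existing one, and the Papagayo/Spine steps are only outlined:

    ```csharp
    using UnityEngine;
    using AC;

    // Hypothetical custom animation engine implementing the proposed
    // StartLipSync hook for a Spine character.
    public class AnimEngine_SpineLipSync : AnimEngine
    {
        public void StartLipSync (int lineID, string language, string message)
        {
            // 1. Use lineID to look up the matching Papagayo data file
            // 2. Parse its frame/phoneme pairs
            // 3. Run a coroutine that swaps the Spine mouth-slot attachment
            //    to the matching phoneme shape as the line plays
        }
    }
    ```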
  • Yes, that could work - nice suggestion.  However, I'd like to keep the number of methods down to avoid confusion - have you tried the new custom events feature?  That allows you to run your own code when e.g. a character begins speaking (see this tutorial).  You should find that is enough to work with.
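    For anyone finding this thread later, hooking the custom events feature looks roughly like the listener below. The OnStartSpeech event lives in AC's EventManager; check the tutorial for the exact delegate signature in your AC version, as it may differ from what is sketched here:

    ```csharp
    using UnityEngine;
    using AC;

    // Example custom-events listener, attached to any GameObject in the scene.
    public class SpineLipSyncListener : MonoBehaviour
    {
        void OnEnable ()
        {
            EventManager.OnStartSpeech += StartLipSync;
        }

        void OnDisable ()
        {
            EventManager.OnStartSpeech -= StartLipSync;
        }

        void StartLipSync (AC.Char speakingCharacter, string speechText, int lineID)
        {
            // Begin swapping the Spine mouth-slot attachments for this character,
            // using lineID to locate the matching Papagayo data
        }
    }
    ```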
  • Love it. Thanks Chris. Nice new feature :)
Welcome to the official forum for Adventure Creator.