Migration from HeyGen Interactive Avatar to HeyGen LiveAvatar
last month by Techo company
Hello, our application sends sentences to HeyGen one at a time (we have an LLM upstream and want to minimize time-to-first-speech for the user).
It is currently built around the HeyGen Interactive Avatar async streaming task API: we store each sentence with its task_id in our local state, then use HeyGen events to know whether that sentence is being spoken, has finished, or was interrupted, by matching the task_id carried in the events.
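To make the setup concrete, here is a minimal sketch of the local bookkeeping we do today, keyed by task_id. This is only our application-side state machine; the HeyGen SDK wiring and the exact event names are omitted, and the class and method names here are hypothetical, not part of HeyGen's API:

```typescript
// Local per-sentence state, keyed by the task_id returned when we
// POST each sentence as an async streaming task. (Sketch only;
// SentenceTracker is our own bookkeeping, not a HeyGen class.)
type TaskStatus = "queued" | "speaking" | "done" | "interrupted";

interface SentenceTask {
  taskId: string;
  sentence: string;
  status: TaskStatus;
}

class SentenceTracker {
  private tasks = new Map<string, SentenceTask>();

  // Called after submitting a sentence as an async task.
  addTask(taskId: string, sentence: string): void {
    this.tasks.set(taskId, { taskId, sentence, status: "queued" });
  }

  // Called from the avatar "start talking" event that carries a task_id.
  onStart(taskId: string): void {
    const t = this.tasks.get(taskId);
    if (t) t.status = "speaking";
  }

  // Called from the corresponding "stop talking" event.
  onStop(taskId: string): void {
    const t = this.tasks.get(taskId);
    if (t) t.status = "done";
  }

  // On a user interrupt: everything still queued or mid-speech is cut off.
  // Returns the affected task_ids so the UI can mark those sentences.
  onInterrupt(): string[] {
    const interrupted: string[] = [];
    for (const t of this.tasks.values()) {
      if (t.status === "queued" || t.status === "speaking") {
        t.status = "interrupted";
        interrupted.push(t.taskId);
      }
    }
    return interrupted;
  }

  status(taskId: string): TaskStatus | undefined {
    return this.tasks.get(taskId)?.status;
  }
}
```

With Interactive Avatar this works because every event carries the task_id; the question below is how to get equivalent signals out of LiveAvatar.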
In LiveAvatar, I was not able to find a unique task_id that identifies a "sentence" or "task". Also, when interrupting, it did not tell me which task_ids were interrupted. Even the SDK's "AgentEventsEnum.AVATAR_TRANSCRIPTION" event fires at the end of a sentence's speech, not at the start.
So my questions are the following:
- Using LiveAvatar, can we identify tasks (with a task_id) like in Interactive Avatar?
- Is there a reliable way to detect when an interruption started and ended, and which sentences were interrupted?
- How can we detect the currently spoken sentence as soon as the avatar starts speaking (similar to the previous events in Interactive Avatar)?