## What you’ll build

Some of LiveAvatar’s avatars are rendered with solid green backgrounds. By detecting and removing those green pixels on the client, you can:

- Make the background transparent so the avatar sits on top of your own UI.
- Replace the background with an image (branded backdrop, product shot, office scene).
- Replace the background with a video (looping environment, ambient footage).
## How it works

The avatar video element exposes a normal `MediaStream` with a green background. You render that stream to a canvas, and for every frame:
- Convert each pixel to HSV and check whether its hue falls inside a green range.
- Replace qualifying pixels — either with full transparency, or with a pixel from your custom background.
- Draw the result to the visible canvas.
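Sketched in TypeScript, the per-frame pass above might look like the following. Function and parameter names are illustrative; the demo’s production implementation lives in `lib/chromaKey.ts` and may differ in detail.

```typescript
// Convert an RGB pixel (0-255 channels) to hue (degrees) and saturation (0-1).
function rgbToHueSat(r: number, g: number, b: number): { hue: number; sat: number } {
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  const d = max - min;
  let hue = 0;
  if (d !== 0) {
    if (max === r) hue = 60 * (((g - b) / d) % 6);
    else if (max === g) hue = 60 * ((b - r) / d + 2);
    else hue = 60 * ((r - g) / d + 4);
  }
  if (hue < 0) hue += 360;
  return { hue, sat: max === 0 ? 0 : d / max };
}

// Zero the alpha of pixels inside the green key range, in place.
// `pixels` is the RGBA byte array from ctx.getImageData(...).data.
function keyOutGreen(
  pixels: Uint8ClampedArray,
  minHue: number,        // lower bound of the green hue range, degrees
  maxHue: number,        // upper bound of the green hue range, degrees
  minSaturation: number, // skip near-gray pixels below this saturation
  threshold: number,     // how strongly green must dominate red/blue
): void {
  for (let i = 0; i < pixels.length; i += 4) {
    const r = pixels[i], g = pixels[i + 1], b = pixels[i + 2];
    const { hue, sat } = rgbToHueSat(r, g, b);
    const isGreen =
      hue >= minHue && hue <= maxHue &&
      sat >= minSaturation &&
      g > threshold * Math.max(r, b);
    if (isGreen) pixels[i + 3] = 0; // fully transparent
  }
}
```

To replace the background instead of erasing it, copy the corresponding pixel from your custom background into `pixels[i..i+2]` rather than zeroing the alpha.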
If you’re using the default embed iframe, you won’t have direct access to the underlying video element because it’s in a cross-origin frame. Use the Web SDK or a LITE Mode integration (LiveKit, Agora, Pipecat) instead — those give you the raw `MediaStream`.

Training a custom avatar? Record against a specific solid background color (green is standard, but any saturated, uniform color works) and use the same chroma key technique below to swap it out. Adjust `minHue` / `maxHue` to match the color you trained on.

## Run the reference demo
**liveavatar-web-sdk · apps/bg-removal-demo**

A complete Next.js reference app demonstrating:
- Toggling the chroma key on/off
- Swapping to a solid color
- Preset image backgrounds
- A custom image or video URL pasted at runtime
- Voice chat with the avatar while the backdrop changes
## Initialize
Clone the repo, install dependencies, configure env vars, and start the dev server from the monorepo root:
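The exact commands depend on the repo’s tooling; assuming a pnpm workspace (check the repo README for the actual package manager, repository URL, and script names), setup might look like:

```shell
# Clone the SDK monorepo (substitute the real repository URL) and enter it
git clone <liveavatar-web-sdk-repo-url>
cd liveavatar-web-sdk

# Install dependencies from the monorepo root
pnpm install

# Create the demo's env file and fill in the variables listed below
cp apps/bg-removal-demo/.env.example apps/bg-removal-demo/.env.local   # path is an assumption

# Start the dev server from the monorepo root
pnpm --filter bg-removal-demo dev   # filter name is an assumption
```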
## Environment variables
| Variable | Description |
|---|---|
| `API_KEY` | LiveAvatar API key (server-side only) |
| `API_URL` | LiveAvatar API base URL |
| `NEXT_PUBLIC_API_URL` | Same as `API_URL` (read by the SDK in the browser) |
| `DEFAULT_AVATAR_ID` | Default avatar ID for the start form |
| `DEFAULT_VOICE_ID` | Default voice ID |
| `DEFAULT_CONTEXT_ID` | Default context ID |
| `DEFAULT_LANGUAGE` | Default language code (e.g. `en`) |
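A sketch of what `apps/bg-removal-demo/.env.local` might contain. Every value below is a placeholder; the demo README is the authoritative reference.

```bash
# Server-side only: never ship this to the browser
API_KEY=<your-liveavatar-api-key>
# LiveAvatar API base URL (see your dashboard or the README for the real value)
API_URL=<liveavatar-api-base-url>
# Same value as API_URL, read by the SDK in the browser
NEXT_PUBLIC_API_URL=<liveavatar-api-base-url>
# Defaults for the start form
DEFAULT_AVATAR_ID=<avatar-id>
DEFAULT_VOICE_ID=<voice-id>
DEFAULT_CONTEXT_ID=<context-id>
DEFAULT_LANGUAGE=en
```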
See `apps/bg-removal-demo/README.md` for full setup details, project structure, and the `lib/chromaKey.ts` source the production code is derived from.
## Understanding the details
Here’s how the reference demo does it. Three steps, each with a clear job:

1. Mount the DOM surfaces — a `<video>` element for the raw avatar stream, a `<canvas>` for the keyed output, and a toggle to switch between them.
2. Run the chroma key per frame — read pixels off the video, zero out alpha on green ones, write back to the canvas.
3. Wire the toggle to swap layers — show the canvas (with effect) or the raw video (without), and stop the loop when the session ends.
The only prerequisite is a `<video>` element playing the LiveAvatar stream via the Web SDK or a LITE Mode integration.
### Step 1: Update your HTML
Add a canvas next to the avatar video, plus a checkbox to toggle the effect.
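A minimal markup sketch for this step. The element ids and dimensions are illustrative assumptions, not the demo’s actual markup:

```html
<div class="avatar-stage">
  <!-- Raw avatar stream (green background) -->
  <video id="avatar-video" autoplay playsinline></video>

  <!-- Keyed output: shown when the effect is on -->
  <canvas id="avatar-canvas" width="1280" height="720"></canvas>

  <!-- Toggle the chroma key on/off -->
  <label>
    <input type="checkbox" id="chroma-toggle" checked />
    Remove background
  </label>
</div>
```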
### Step 2: Create the chroma key module

Create `src/chromaKey.ts` with the per-frame keying logic.
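A sketch of what that module could look like, using the names the rest of this page references (`setupChromaKey`, `applyChromaKey`, the `readyState < 2` guard, the `greenness * 4` falloff). The demo’s actual `lib/chromaKey.ts` is the authoritative version; this simplifies the green test.

```typescript
export interface ChromaKeyOptions {
  minHue?: number;
  maxHue?: number;
  minSaturation?: number;
  threshold?: number;
}

function applyChromaKey(
  video: HTMLVideoElement,
  ctx: CanvasRenderingContext2D,
  opts: Required<ChromaKeyOptions>,
): void {
  if (video.readyState < 2) return; // no decodable frame yet

  const { width, height } = ctx.canvas;
  ctx.drawImage(video, 0, 0, width, height);
  const frame = ctx.getImageData(0, 0, width, height);
  const px = frame.data;

  for (let i = 0; i < px.length; i += 4) {
    const r = px[i], g = px[i + 1], b = px[i + 2];
    // How far green exceeds the threshold relative to the stronger of
    // red/blue: 0 at the threshold, larger for purer greens. The real
    // module also checks hue and saturation bounds before this point.
    const greenness = g / (opts.threshold * Math.max(r, b, 1)) - 1;
    if (greenness > 0) {
      // Soft-edge falloff: the * 4 multiplier sets how fast borderline
      // pixels fade to transparent (see Troubleshooting).
      px[i + 3] = Math.max(0, 255 * (1 - greenness * 4));
    }
  }
  ctx.putImageData(frame, 0, 0);
}

export function setupChromaKey(
  video: HTMLVideoElement,
  canvas: HTMLCanvasElement,
  options: ChromaKeyOptions = {},
): () => void {
  const opts = { minHue: 60, maxHue: 180, minSaturation: 0.1, threshold: 1.0, ...options };
  const ctx = canvas.getContext("2d", { willReadFrequently: true })!;
  let rafId = 0;

  const tick = () => {
    applyChromaKey(video, ctx, opts);
    rafId = requestAnimationFrame(tick);
  };
  rafId = requestAnimationFrame(tick);

  // Cleanup: cancel the frame loop when the session ends.
  return () => cancelAnimationFrame(rafId);
}
```

Note the `willReadFrequently: true` hint on the 2D context: it keeps the canvas in CPU-accessible memory, which helps the `getImageData` loop.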
### Step 3: Wire up the toggle
In the code that owns your avatar video element, import `setupChromaKey` and switch between the canvas and the raw video based on the checkbox state.
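One way to structure that wiring, as a sketch. `setupChromaKey` is injected as a parameter so the helper stays self-contained; the element handles are whatever your app already holds. The demo’s real wiring may differ.

```typescript
type Cleanup = () => void;

export function wireChromaToggle(
  video: HTMLVideoElement,
  canvas: HTMLCanvasElement,
  toggle: HTMLInputElement,
  setupChromaKey: (v: HTMLVideoElement, c: HTMLCanvasElement) => Cleanup,
): Cleanup {
  let stopKeying: Cleanup | null = null;

  const apply = () => {
    const on = toggle.checked;
    if (on && !stopKeying) {
      stopKeying = setupChromaKey(video, canvas); // start the per-frame loop
    } else if (!on && stopKeying) {
      stopKeying(); // stop it when toggled off
      stopKeying = null;
    }
    canvas.style.display = on ? "block" : "none"; // show the keyed output,
    video.style.display = on ? "none" : "block";  // or the raw video
  };

  toggle.addEventListener("change", apply);
  apply();

  // Full teardown: call when the session ends.
  return () => {
    toggle.removeEventListener("change", apply);
    if (stopKeying) stopKeying();
  };
}
```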
## Add an image or video background
Transparency works when you want the avatar to float over your existing page. For a custom backdrop — a branded scene, an office environment, a product shot — composite a background layer behind the canvas in the DOM. No changes to the chroma key module are required. The canvas output is already transparent where the green used to be, so anything behind it shows through.

### Image background
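For example, a positioned wrapper with an image layered behind the canvas. The ids, paths, and inline styles are placeholders:

```html
<div style="position: relative;">
  <!-- Background layer: sits behind the keyed canvas -->
  <img id="avatar-backdrop" src="/backgrounds/office.jpg" alt=""
       style="position: absolute; inset: 0; width: 100%; height: 100%; object-fit: cover;" />

  <!-- Keyed output: transparent where the green screen was -->
  <canvas id="avatar-canvas" style="position: relative;"></canvas>
</div>
```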
### Video background
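Same layering, with a muted looping video as the backdrop. The path is a placeholder:

```html
<div style="position: relative;">
  <!-- Looping ambient footage behind the keyed canvas -->
  <video src="/backgrounds/lobby-loop.mp4" autoplay muted loop playsinline
         style="position: absolute; inset: 0; width: 100%; height: 100%; object-fit: cover;"></video>

  <canvas id="avatar-canvas" style="position: relative;"></canvas>
</div>
```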
## Swap backgrounds at runtime
Because the background is a sibling DOM element, swapping it is a one-liner. No session restart, no chroma key reconfiguration.
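The one-liner in question, wrapped in a tiny helper for clarity. The element id and paths in the usage note are placeholders:

```typescript
// Point the background element at a new source. Works the same for an <img>
// or a <video> background layer; the keyed canvas above it is transparent,
// so the new backdrop shows through on the next paint.
function swapBackdrop(el: { src: string }, url: string): void {
  el.src = url;
}
```

In the page that would be, e.g., `swapBackdrop(document.querySelector("#avatar-backdrop")!, "/backgrounds/product.jpg")`.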
## Make the avatar fully transparent

For overlaying the avatar on your own page — a floating assistant in the corner of a product UI, an in-app tutor, a kiosk overlay — skip the background entirely and let the page behind the canvas show through.
## Tuning the chroma key

The chroma key parameters can be adjusted to fine-tune the effect:

- `minHue` / `maxHue` — range of green hues to detect. `60–180` covers most greens. Narrow it (e.g., `90–150`) if the avatar’s clothing is being partially keyed out.
- `minSaturation` — minimum saturation for detection. Avoids treating unsaturated grays and whites as green. The default `0.1` is usually fine.
- `threshold` — how much “greener” a pixel must be vs. its red/blue components. Higher is stricter. Lower it toward `0.8` if you see a green halo; raise it toward `1.2` if avatar pixels are becoming transparent.
## Troubleshooting
**Green halo around the avatar’s edges.** The soft-edge falloff is controlled by the `greenness * 4` multiplier in `applyChromaKey`. Increase it (try `* 6` or `* 8`) to make the fade-to-transparent more aggressive, which eats further into the halo. You can also tighten `maxHue` to `160` so only the purest greens are keyed.
**Parts of the avatar are becoming transparent.** The avatar is likely wearing something close to the key color (green tie, green shirt). Raise `threshold` to `1.2` or `1.3` so only pixels where green strongly dominates red and blue are removed. If the avatar is wearing green by design, pick a different avatar — chroma keying a green-on-green subject isn’t solvable client-side.
**Edges flicker frame-to-frame.** Each frame is keyed independently, so small boundary flicker is expected. To reduce it, render the canvas at the same resolution as the source video (don’t upscale) and avoid CSS filters like `filter: blur()` on the canvas.
**Performance is poor on mobile or at 1080p.** The `getImageData` → per-pixel loop → `putImageData` path is CPU-bound and can struggle above 720p on low-end devices. Two options:

- Render at a lower resolution by setting the canvas width/height to `640×360` or `854×480` — the avatar still looks sharp because the source stream scales gracefully.
- Use WebGL — a fragment-shader implementation runs on the GPU and handles 1080p at 60 FPS comfortably. Bigger lift, but worth it for high-resolution or mobile-first deployments.
## Common pitfalls
**The canvas is blank but the video is playing.** `applyChromaKey` guards against this with a `readyState < 2` check, but don’t call `setupChromaKey` before the video has a `srcObject`. Wait for your stream-ready event (e.g., the Web SDK’s `STREAM_READY`) before starting processing.
**The embed iframe doesn’t expose the video element.** The default embed is designed for drop-in simplicity and doesn’t expose the raw video stream. For chroma keying, switch to the Web SDK or a LITE Mode integration where you control the rendering layer.
**Chroma key keeps running after the session ends.** `setupChromaKey` returns a cleanup function — call it whenever the stream disconnects or the session terminates. Otherwise the animation frame loop keeps running against a stale video element and wastes CPU.
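Binding that cleanup to the session lifecycle might look like the sketch below. `STREAM_READY` is the Web SDK event named earlier on this page; the disconnect event name is an assumption, so substitute your integration’s actual event.

```typescript
type Cleanup = () => void;

interface SessionLike {
  on(event: string, handler: () => void): void;
}

// Start keying once frames exist; stop the rAF loop when the stream goes away.
export function bindChromaKeyToSession(session: SessionLike, start: () => Cleanup): void {
  let cleanup: Cleanup | null = null;

  session.on("STREAM_READY", () => {
    cleanup = start(); // begin keying only once the video has a srcObject
  });

  session.on("DISCONNECTED", () => { // hypothetical event name
    cleanup?.(); // stop the loop so it doesn't run against a stale video
    cleanup = null;
  });
}
```

Here `start` would be, e.g., `() => setupChromaKey(video, canvas)`.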