A visual choreographer for AI agents

The signals are the music. The themes are the dancers. sajou is the choreographer.

Signal

Raw data. Task dispatches, tool calls, errors, MIDI events. The pulse of your agents.
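As a sketch, the stream might carry typed events like these. The shape below is illustrative TypeScript, not sajou's actual schema:

```ts
// Hypothetical signal shapes for the event kinds named above.
// Field names are assumptions; sajou's real schema may differ.
type Signal =
  | { type: "task.dispatch"; agent: string; task: string; at: number }
  | { type: "tool.call"; agent: string; tool: string; at: number }
  | { type: "error"; agent: string; message: string; at: number }
  | { type: "midi.cc"; controller: number; value: number; at: number };
```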

Choreographer

Declarative sequences. JSON that describes what happens visually. Composable by humans and AIs.
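A hedged sketch of what one sequence could look like; the trigger and action names here are illustrative, not sajou's documented vocabulary:

```ts
// An illustrative choreography: one trigger, a short action chain.
// Plain JSON-compatible data, so a human or an AI can write it.
const choreography = {
  trigger: { signal: "tool.call" },
  steps: [
    { action: "pulse", target: "agent-node", color: "#4fc3f7", ms: 300 },
    { action: "emit-particles", target: "agent-node", count: 50 },
    { action: "light", target: "key-light", intensity: 1.5, ms: 200 },
  ],
};
```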

Stage

The render. Three.js, shaders, particles, lights. Same data, different scene, different experience.
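For orientation, a minimal stage in plain Three.js: one entity, one light, one animation loop. This is generic Three.js, not sajou's render code; a choreography would drive the motion instead of the placeholder rotation:

```ts
import * as THREE from "three";

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(60, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 5;

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

// One entity and one light; the same signal data could drive any scene.
const entity = new THREE.Mesh(
  new THREE.IcosahedronGeometry(1),
  new THREE.MeshStandardMaterial({ color: 0x88ccff })
);
const light = new THREE.PointLight(0xffffff, 30);
light.position.set(3, 3, 3);
scene.add(entity, light);

renderer.setAnimationLoop((timeMs) => {
  entity.rotation.y = timeMs / 1000; // placeholder motion
  renderer.render(scene, camera);
});
```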

Screenshot: the signal panel, streaming live events from multiple agent sources. Every event is captured and typed, with a colored badge per event type.

Screenshot: the stage. Entities, lights, and particles on a Three.js canvas, alongside the editor tools.

Screenshot: the choreographer. Step chains with trigger blocks and color-coded action sequences. Drag, connect, compose.

Shaders are first-class citizens.

WebGL fragment shaders run natively in the stage. Perlin noise, wave fields, fractal patterns — real-time visual computing, not pre-rendered mockups.
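As an illustration, a wave-field fragment shader wired up through Three.js's ShaderMaterial. The `uTime` uniform and the pattern are an example, not sajou's shader API:

```ts
import * as THREE from "three";

// A quad whose color is computed per fragment, every frame.
const material = new THREE.ShaderMaterial({
  uniforms: { uTime: { value: 0 } },
  fragmentShader: /* glsl */ `
    uniform float uTime;
    void main() {
      vec2 uv = gl_FragCoord.xy / 512.0;
      // Two interfering sine waves: a simple real-time wave field.
      float w = sin(uv.x * 10.0 + uTime) * cos(uv.y * 10.0 - uTime);
      gl_FragColor = vec4(0.1, 0.4 + 0.3 * w, 0.6 + 0.4 * w, 1.0);
    }
  `,
});
// Added to the stage scene like any other entity.
const quad = new THREE.Mesh(new THREE.PlaneGeometry(2, 2), material);

// Advance the shader clock each frame (a stage would do this in its loop).
function tick(timeMs: number) {
  material.uniforms.uTime.value = timeMs / 1000;
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```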

Agent monitoring

Watch your AI agents work the way you'd watch a StarCraft match, not a log file.

Creative coding

MIDI controllers, shaders, particles. sajou is a visual instrument.
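A sketch of the wiring, using the standard Web MIDI API; the `dispatch` sink and the signal shape are hypothetical:

```ts
// Hypothetical sink: in sajou this would feed the signal stream.
declare function dispatch(signal: { type: string; controller: number; value: number }): void;

// Web MIDI is a browser standard; requestMIDIAccess prompts for permission.
const midi = await navigator.requestMIDIAccess();

for (const input of midi.inputs.values()) {
  input.onmidimessage = (msg) => {
    if (!msg.data) return;
    const [status, cc, value] = msg.data;
    // Control Change messages (0xB0-0xBF): map a knob's 0-127 range to 0-1.
    if ((status & 0xf0) === 0xb0) {
      dispatch({ type: "midi.cc", controller: cc, value: value / 127 });
    }
  };
}
```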

AI-composable scenes

Tell an AI: "make me a My Little Pony theme." sajou plays it.

Watch your agents dance.