Design the conversation. Across every channel.
ClearChannel by Vestara
Enterprise conversational AI is never one channel. IVR, chatbot, and agent assist run at once, with shared customers and compliance needs but different output constraints. Most designers optimize one channel. ClearChannel shows all three from the same utterance, so cross-channel tradeoffs and failures (for example, bereavement routed like a balance inquiry) are visible and auditable.
The proof point
When the bereavement utterance fires, the system suppresses account verification, routes to a senior specialist, and the entire application turns purple. The UI does not just label sentiment. It inhabits it.
That is not decoration. That is the proof that emotional state handling was designed in, not added on.
The problem
The stakes are not abstract
11 sample utterances covering emotional edge cases
ClearChannel product scope
Five sentiment states driving the full CSS token system
Design system
18 classified intents with priority override rules
NLU architecture
Process
How it was built
STEP 01 — THE BRIEF
The brief and constraint set
An enterprise conversational channels team needed a designer who understood IVR, chatbot, and agent assist as a system. I read that requirement as a product spec and built the tool that would make a hiring team say, "She already understands our system." The lab opens empty, with no pre-seeded result and two paths: open a sample or start a Live Call. The first screen is the brief.
Pivots
What changed and why
The lab originally opened with a pre-loaded analysis result. A busy first screen read as a data dump on mobile and bypassed the product story. The decision was to start empty with two paths: open a sample or start a Live Call. Analysis only appears after the user acts.
We chose clarity over looking live on load. The first screen is the brief.
Analysis originally waited for one full JSON response before rendering. The thesis is one utterance, many channels simultaneously. A blocking response hides that structure. Moving to SSE with progressive section extraction makes the parallel outputs visible as they arrive.
Align implementation with the product argument. The same reasoning surfaces across all channels, not as a sequential reveal, but as one response unfolding in real time.
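The streaming pivot can be sketched as a pure extraction step: as SSE chunks append to a buffer, any section whose close marker has arrived is rendered immediately. This is a minimal sketch assuming a simple tagged delimiter format; the real app streams structured JSON from the Claude API, and the marker names here are illustrative.

```typescript
// Sketch of progressive section extraction over an SSE text stream.
// The delimiter format (<<CHANNEL>> ... <<END>>) is an assumption for
// illustration, not the app's actual wire format.
type Extracted = { channel: string; body: string };

function extractCompleteSections(buffer: string): { sections: Extracted[]; rest: string } {
  const sections: Extracted[] = [];
  // Match every fully closed section; anything after the last close
  // marker is kept as the unfinished remainder for the next chunk.
  const re = /<<(\w+)>>([\s\S]*?)<<END>>/g;
  let lastIndex = 0;
  let m: RegExpExecArray | null;
  while ((m = re.exec(buffer)) !== null) {
    sections.push({ channel: m[1], body: m[2].trim() });
    lastIndex = re.lastIndex;
  }
  return { sections, rest: buffer.slice(lastIndex) };
}
```

Because extraction is pure, each completed channel panel can fill the moment its section closes, while the remainder carries over to the next chunk.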
IVR audio originally used AudioContext and decoded buffers. On iOS Safari, the user-gesture requirement breaks across await boundaries, causing silent failures on tap-to-play. The pattern was changed to fetch → Blob URL → HTMLAudioElement.play(), with the Blob URL cleaned up on playback end and on unmount.
IVR is audible-first. Mobile playback is a product requirement, not a nice-to-have.
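The playback pattern looks roughly like this. The browser globals (fetch, URL.createObjectURL, new Audio()) are injected as parameters here purely so the flow can be exercised outside a browser; the /api/speak endpoint name comes from the case study, everything else is a sketch.

```typescript
// Sketch of the fetch → Blob URL → HTMLAudioElement tap-to-play pattern.
// Dependencies are injected for testability; in the app these would be
// fetch (POST /api/speak), URL.createObjectURL/revokeObjectURL, and Audio.
type AudioLike = { src: string; onended: (() => void) | null; play(): Promise<void> };

interface AudioDeps {
  fetchSpeech: (text: string) => Promise<Blob>;
  createObjectURL: (blob: Blob) => string;
  revokeObjectURL: (url: string) => void;
  makeAudio: () => AudioLike;
}

async function playIvrLine(text: string, deps: AudioDeps): Promise<AudioLike> {
  const blob = await deps.fetchSpeech(text);
  const url = deps.createObjectURL(blob);
  const audio = deps.makeAudio();
  audio.src = url;
  // Release the Blob URL when playback ends; the component mirrors
  // this cleanup on unmount.
  audio.onended = () => deps.revokeObjectURL(url);
  await audio.play();
  return audio;
}
```

The key property is that playback is a single element.play() call rather than a decoded-buffer pipeline, which is what survives iOS Safari's gesture rules.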
The original design used a small badge to label sentiment state. If only a badge changes color, reviewers miss the point that emotional state changes routing, copy, and system behavior. Semantic CSS variables driven by data-sentiment now retint background, topbar, accents, panels, and prosody indicators with smooth transitions.
The UI does not just label sentiment. It inhabits it.
The welcome experience could not be scrolled to completion on small screens, and several controls did not meet 44px touch targets. It was rebuilt with backdrop scrolling (not a trapped modal), 100dvh sizing, 44px touch targets throughout, and readable label sizes on small screens.
Portfolio demos are shipped software. Onboarding and thumb-sized UI are part of the proof.
Live Call originally lived in an easy-to-miss location. Typed samples show NLU design; live voice shows contact-center reality. Live Call was promoted to the empty-state hero as a primary CTA alongside sample utterances, with a stronger topbar control and a header layout that does not break at narrow widths.
We run two proofs in one lab: designed utterances for edge cases, and live speech for what a call feels like.
What shipped
Every layer, production-ready
Sample coverage
IRA to brokerage fund transfer. Unauthorized transaction / fraud detection. Balance inquiry (baseline). Retirement planning. Bereavement / beneficiary update (death of spouse). Market anxiety / panic-selling. Repeat caller frustration. Barge-in escalation. Vague distress. Cognitive accessibility (family member managing account). Time pressure / urgent deadline.
Channel outputs
IVR: spoken script with prosody annotations, entities, routing, fallback. Chatbot: bot response, quick replies, containment decision, handoff context. Agent Assist: suggested script, policy references, compliance flags, escalation path.
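The three channel payloads above can be read as one typed result per utterance. A sketch of that shape, with field names drawn from the lists above but an otherwise assumed schema:

```typescript
// One utterance, three channel outputs. Field names follow the case
// study's channel lists; the exact schema is illustrative.
interface IvrOutput {
  script: string;              // spoken script with prosody annotations
  prosody: string[];
  entities: Record<string, string>;
  routing: string;
  fallback: string;
}
interface ChatbotOutput {
  response: string;
  quickReplies: string[];
  contained: boolean;          // containment decision
  handoffContext?: string;     // present when the bot hands off
}
interface AgentAssistOutput {
  suggestedScript: string;
  policyRefs: string[];
  complianceFlags: string[];
  escalationPath: string;
}

interface ChannelAnalysis {
  utterance: string;
  ivr: IvrOutput;
  chatbot: ChatbotOutput;
  agentAssist: AgentAssistOutput;
}
```

Holding all three outputs in one object is what makes cross-channel comparison, and the single streamed response, possible.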
NLU architecture
18 intents with priority override rules. Confidence score and threshold visualization. Entity schema. Training phrase suggestions. Collapsible four-column grid.
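Threshold gating plus priority overrides can be sketched as one resolution step. The intent names, threshold values, and priority weights below are illustrative; the shipped taxonomy has 18 intents.

```typescript
// Sketch of confidence-threshold gating with priority override rules.
// All names and numbers here are illustrative placeholders.
interface IntentScore { intent: string; confidence: number }

const THRESHOLDS: Record<string, number> = {
  balance_inquiry: 0.7,
  bereavement: 0.5,   // sensitive intents fire at lower confidence
  fraud_report: 0.55,
};

// Higher-priority intents win even when a routine intent scores higher.
const PRIORITY: Record<string, number> = {
  bereavement: 100,
  fraud_report: 90,
  balance_inquiry: 10,
};

function resolveIntent(scores: IntentScore[]): string {
  const passing = scores.filter(
    (s) => s.confidence >= (THRESHOLDS[s.intent] ?? 0.7)
  );
  if (passing.length === 0) return "fallback_clarify";
  passing.sort(
    (a, b) =>
      (PRIORITY[b.intent] ?? 0) - (PRIORITY[a.intent] ?? 0) ||
      b.confidence - a.confidence
  );
  return passing[0].intent;
}
```

This is why a bereavement signal at moderate confidence can outrank a high-confidence balance inquiry: priority is resolved after thresholding, not by raw score.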
Sentiment theming
Five states: neutral (teal #0891B2), concerned (amber #D97706), distressed (purple #7C3AED), urgent (red #DC2626), confused (blue #3B82F6). data-sentiment attribute cascades through all CSS token surfaces. Smooth transitions on topbar, background, accents, borders, pills, prosody indicators.
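The five states and their accent tokens can be mirrored in TypeScript; the hex values are the ones listed above, while the helper is a hypothetical sketch of how setting the root attribute hands control to the CSS cascade.

```typescript
// The five sentiment states and their accent tokens, as listed above.
const SENTIMENT_ACCENTS = {
  neutral: "#0891B2",    // teal
  concerned: "#D97706",  // amber
  distressed: "#7C3AED", // purple
  urgent: "#DC2626",     // red
  confused: "#3B82F6",   // blue
} as const;

type Sentiment = keyof typeof SENTIMENT_ACCENTS;

// Hypothetical helper: set data-sentiment on a root-like element so
// selectors such as [data-sentiment="distressed"] { --accent: ... }
// retint every token surface at once.
function applySentiment(
  root: { setAttribute(name: string, value: string): void },
  s: Sentiment
): void {
  root.setAttribute("data-sentiment", s);
}
```

One attribute write is the entire theming API: every surface listens to the cascade rather than being retinted individually.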
Override rules
Bereavement: verification suppressed, senior specialist routing, distressed theme. Fraud: escalation, urgent theme. Market anxiety: behavioral coaching guardrail, concerned theme. Barge-in: interruption detection.
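A structural override changes the call plan itself, not just the label. A sketch under assumed field names, with the bereavement behavior taken directly from the rules above:

```typescript
// Sketch of structural override rules. The CallPlan fields and the
// default cases are illustrative; the bereavement behavior matches
// the case study (suppress verification, senior specialist, distressed).
interface CallPlan { verification: boolean; route: string; theme: string }

function applyOverrides(intent: string, base: CallPlan): CallPlan {
  switch (intent) {
    case "bereavement":
      // Suppress identity verification, route to a senior specialist,
      // and retheme the whole UI to the distressed state.
      return { verification: false, route: "senior_specialist", theme: "distressed" };
    case "fraud_report":
      return { ...base, route: "fraud_desk", theme: "urgent" };
    case "market_anxiety":
      // Behavioral coaching guardrail: keep the plan, change the tone.
      return { ...base, theme: "concerned" };
    default:
      return base;
  }
}
```

The override returns a new plan rather than mutating the base one, so the before/after difference stays auditable.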
Voice and audio
MediaRecorder with OpenAI Whisper (/api/transcribe) for voice input. OpenAI TTS (/api/speak) with Blob URL and HTMLAudioElement for IVR playback. OpenAI Realtime (/api/realtime-session) for persistent Live Call mode. SSE streaming with progressive panel fill.
Design artifact page
/design-artifact: static page, no API calls. Intent taxonomy: all 18 intents with category, threshold, sentiment state. Override priority rules with structural changes and channel behavior. Entity schema with intent associations. Channel routing matrix. Sentiment state map with design token swatches from globals.css. Data from lib/designArtifactData.ts.
Infrastructure
Next.js 16, React 19, TypeScript, Tailwind CSS. Claude API (claude-sonnet-4-6), structured JSON, SSE streaming. OpenAI Whisper, TTS, Realtime. Vercel.
What this demonstrates
For every audience
IVR, Chatbot, and Agent Assist as a unified architecture from one utterance, not three separate demos.
Emotional state-driven UI theming across five states from a single root data-sentiment attribute.
18 intents, entity schema, confidence threshold, priority override rules surfaced as practitioner-readable evidence.
MediaRecorder, OpenAI Whisper, TTS audio playback, OpenAI Realtime Live Call.
Progressive fill aligned to the product thesis: parallel outputs visible as they stream.
Design artifact page and lib/designArtifactData.ts as a second deliverable.
Each pivot documents a real constraint, decision, and tradeoff.
Acknowledges what is and is not implemented: simulated NLU via Claude, parse failure UX gap, no Dialogflow or LUIS integration.
The honest summary
Three ways to understand this work
For engineers
For product
For design
The bereavement utterance fires. The system suppresses account verification, routes to a senior specialist, and the entire application turns purple. That is not decoration. That is the proof that emotional state handling was designed in, not added on.