
Design the conversation. Across every channel.

ClearChannel by Vestara

Enterprise conversational AI is never one channel. IVR, chatbot, and agent assist run at once with shared customers and compliance needs but different output constraints. Most designers optimize one channel. ClearChannel shows all three from the same utterance so cross-channel tradeoffs and failures (for example bereavement routed like a balance inquiry) are visible and auditable.

Conversation design, NLU architecture, product design, full-stack build · March 2026
CONVERSATION-DESIGN · NLU-ARCHITECTURE · ENTERPRISE-FINTECH · PORTFOLIO-ARTIFACT
Status: Live

The proof point

When the bereavement utterance fires, the system suppresses account verification, routes to a senior specialist, and the entire application turns purple. The UI does not just label sentiment. It inhabits it.

That is not decoration. That is the proof that emotional state handling was designed in, not added on.

The problem

The stakes are not abstract

11

sample utterances covering emotional edge cases

ClearChannel product scope

5

sentiment states driving the full CSS token system

Design system

18

classified intents with priority override rules

NLU architecture

Live demo: ClearChannel by Vestara

Process

How it was built

STEP 01 — THE BRIEF

The brief and constraint set

An enterprise conversational channels team needed a designer who understood IVR, chatbot, and agent assist as a system. I read that requirement as a product spec and built the tool that would make a hiring team say, "she already understands our system." The lab opens empty: no pre-seeded result, two paths — open a sample or start a Live Call. The first screen is the brief.

Pivots

What changed and why

The lab originally opened with a pre-loaded analysis result. A busy first screen read as a data dump on mobile and bypassed the product story. The decision was to start empty with two paths: open a sample or start a Live Call. Analysis only appears after the user acts.

We chose clarity over looking live on load. The first screen is the brief.

Analysis originally waited for one full JSON response before rendering. The thesis is one utterance, many channels simultaneously. A blocking response hides that structure. Moving to SSE with progressive section extraction makes the parallel outputs visible as they arrive.

Align implementation with the product argument. The same reasoning surfaces across all channels, not as a sequential reveal, but as one response unfolding in real time.
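The SSE consumption can be sketched as a small parser: split the accumulated text buffer on blank lines into complete `data:` events and keep the incomplete tail for the next chunk. This is a minimal illustration of standard SSE framing, not the app's actual client code.

```typescript
// Minimal SSE chunk parser: returns the complete "data:" events in
// a text buffer plus any incomplete trailing text to carry forward.
export function parseSSE(buffer: string): { events: string[]; rest: string } {
  const events: string[] = [];
  // SSE events are separated by a blank line.
  const parts = buffer.split("\n\n");
  const rest = parts.pop() ?? ""; // the last part may still be streaming in
  for (const part of parts) {
    const data = part
      .split("\n")
      .filter((line) => line.startsWith("data:"))
      .map((line) => line.slice(5).trimStart())
      .join("\n");
    if (data) events.push(data);
  }
  return { events, rest };
}
```

Each parsed event can then be appended to the running response text, which is what lets the channel panels fill as sections arrive instead of waiting for the full payload.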

IVR audio originally used AudioContext and decoded buffers. On iOS Safari, user-gesture chains break across await, causing silent failures on tap-to-play. The pattern was changed to fetch → Blob URL → HTMLAudioElement.play(), with cleanup on end and unmount.

IVR is audible-first. Mobile playback is a product requirement, not a nice-to-have.
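A sketch of that playback path, assuming the /api/speak route described here returns raw audio; error handling and unmount cleanup are simplified for illustration:

```typescript
// Wrap fetched TTS audio in a Blob URL for HTMLAudioElement playback.
export function toBlobUrl(audio: Blob): string {
  return URL.createObjectURL(audio);
}

// Fetch spoken audio for an IVR script line and play it.
// Revokes the Blob URL when playback ends so memory is reclaimed.
export async function playIvr(text: string): Promise<void> {
  const res = await fetch("/api/speak", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ text }),
  });
  const url = toBlobUrl(await res.blob());
  const el = new Audio(url);
  el.addEventListener("ended", () => URL.revokeObjectURL(url), { once: true });
  await el.play();
}
```

Because the audio element's play() runs in the same task as the user's tap, the gesture requirement survives in a way the decoded-buffer approach did not.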

The original design used a small badge to label sentiment state. If only a badge changes color, reviewers miss the point that emotional state changes routing, copy, and system behavior. Semantic CSS variables driven by data-sentiment now retint background, topbar, accents, panels, and prosody indicators with smooth transitions.

The UI does not just label sentiment. It inhabits it.
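The token cascade can be sketched as a map from the five states to their accent colors (the hex values listed under "Sentiment theming" below), plus a one-line apply step; the actual CSS variable set is broader than this illustration.

```typescript
// The five sentiment states and their accent tokens.
export type Sentiment =
  | "neutral"
  | "concerned"
  | "distressed"
  | "urgent"
  | "confused";

export const SENTIMENT_ACCENT: Record<Sentiment, string> = {
  neutral: "#0891B2",    // teal
  concerned: "#D97706",  // amber
  distressed: "#7C3AED", // purple
  urgent: "#DC2626",     // red
  confused: "#3B82F6",   // blue
};

// Setting data-sentiment on the root lets CSS variables retint
// background, topbar, accents, panels, and prosody indicators at once,
// e.g. applySentiment(document.documentElement, "distressed").
export function applySentiment(
  root: { setAttribute(name: string, value: string): void },
  s: Sentiment
): void {
  root.setAttribute("data-sentiment", s);
}
```

One attribute write, many surfaces: the design-system work lives in the CSS tokens, not in per-component logic.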

The welcome experience could not be scrolled to completion on small screens, and several controls did not meet 44px touch targets. It was rebuilt with backdrop scrolling (not a trapped modal), 100dvh, 44px touch targets throughout, and readable label sizes on small screens.

Portfolio demos are shipped software. Onboarding and thumb-sized UI are part of the proof.

Live Call originally lived in an easy-to-miss location. Typed samples show NLU design; live voice shows contact-center reality. It was promoted to the empty-state hero as a primary CTA alongside sample utterances, with a stronger topbar control and a header layout that does not break on narrow widths.

We run two proofs in one lab: designed utterances for edge cases, and live speech for what a call feels like.

What shipped

Every layer, production-ready

Sample coverage

IRA to brokerage fund transfer. Unauthorized transaction / fraud detection. Balance inquiry (baseline). Retirement planning. Bereavement / beneficiary update (death of spouse). Market anxiety / panic-selling. Repeat caller frustration. Barge-in escalation. Vague distress. Cognitive accessibility (family member managing account). Time pressure / urgent deadline.

Channel outputs

IVR: spoken script with prosody annotations, entities, routing, fallback. Chatbot: bot response, quick replies, containment decision, handoff context. Agent Assist: suggested script, policy references, compliance flags, escalation path.

NLU architecture

18 intents with priority override rules. Confidence score and threshold visualization. Entity schema. Training phrase suggestions. Collapsible four-column grid.

Sentiment theming

Five states: neutral (teal #0891B2), concerned (amber #D97706), distressed (purple #7C3AED), urgent (red #DC2626), confused (blue #3B82F6). data-sentiment attribute cascades through all CSS token surfaces. Smooth transitions on topbar, background, accents, borders, pills, prosody indicators.

Override rules

Bereavement: verification suppressed, senior specialist routing, distressed theme. Fraud escalation: urgent theme. Market anxiety behavioral coaching guardrail: concerned theme. Barge-in interruption detection.
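A sketch of how the override rules above resolve; the field and intent names are illustrative assumptions, since the page does not show the rule engine itself:

```typescript
interface Analysis {
  intent: string;
  sentiment: "neutral" | "concerned" | "distressed" | "urgent" | "confused";
  verificationRequired: boolean;
  routing: string;
}

// Priority overrides run after intent classification and may change
// structural behavior, not just labels.
export function applyOverrides(a: Analysis): Analysis {
  if (a.intent === "bereavement_beneficiary_update") {
    // Bereavement: suppress verification, route to a senior
    // specialist, drive the distressed (purple) theme.
    return {
      ...a,
      verificationRequired: false,
      routing: "senior_specialist",
      sentiment: "distressed",
    };
  }
  if (a.intent === "fraud_report") return { ...a, sentiment: "urgent" };
  if (a.intent === "market_anxiety") return { ...a, sentiment: "concerned" };
  return a;
}
```

The key property is that an override rewrites routing and verification, which is why a badge-only sentiment label undersold the design.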

Voice and audio

MediaRecorder + OpenAI Whisper (/api/transcribe) for voice input. OpenAI TTS (/api/speak) + Blob URL + HTMLAudioElement for IVR playback. OpenAI Realtime (/api/realtime-session) for persistent Live Call mode. SSE streaming with progressive panel fill.
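The voice-input leg can be sketched as posting the MediaRecorder output to the /api/transcribe route as multipart form data; the form field name and filename here are assumptions for illustration.

```typescript
// Package a MediaRecorder output Blob for the transcription route.
export function buildTranscribeForm(recording: Blob): FormData {
  const form = new FormData();
  form.append("audio", recording, "utterance.webm");
  return form;
}

// Send the recording server-side, where Whisper does the actual
// transcription (no browser Web Speech involved).
export async function transcribe(recording: Blob): Promise<string> {
  const res = await fetch("/api/transcribe", {
    method: "POST",
    body: buildTranscribeForm(recording),
  });
  const { text } = await res.json();
  return text;
}
```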

Design artifact page

/design-artifact: static page, no API calls. Intent taxonomy: all 18 intents with category, threshold, sentiment state. Override priority rules with structural changes and channel behavior. Entity schema with intent associations. Channel routing matrix. Sentiment state map with design token swatches from globals.css. Data from lib/designArtifactData.ts.
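A hypothetical shape for one record in lib/designArtifactData.ts; the real file's field names are not shown on this page, so this only illustrates the taxonomy it carries (intent, category, threshold, sentiment state):

```typescript
export interface IntentRecord {
  id: string;
  category: string;
  confidenceThreshold: number; // 0..1
  sentiment: "neutral" | "concerned" | "distressed" | "urgent" | "confused";
}

// Illustrative entry only — the baseline intent from the sample set.
export const exampleIntent: IntentRecord = {
  id: "balance_inquiry",
  category: "account",
  confidenceThreshold: 0.7,
  sentiment: "neutral",
};
```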

Infrastructure

Next.js 16, React 19, TypeScript, Tailwind CSS. Claude API (claude-sonnet-4-6), structured JSON, SSE streaming. OpenAI Whisper, TTS, Realtime. Vercel.

Claude API (claude-sonnet-4-6, SSE streaming, progressive panel fill) · OpenAI (Whisper · TTS · Realtime) · data-sentiment CSS token architecture (five emotional state themes) · Next.js 16 · React 19 · TypeScript · Tailwind CSS · Vercel · Server-Sent Events

What this demonstrates

For every audience

IVR, Chatbot, and Agent Assist as a unified architecture from one utterance, not three separate demos.

Emotional state-driven UI theming across five states from a single root data-sentiment attribute.

18 intents, entity schema, confidence threshold, priority override rules surfaced as practitioner-readable evidence.

MediaRecorder, OpenAI Whisper, TTS audio playback, OpenAI Realtime Live Call.

Progressive fill aligned to the product thesis: parallel outputs visible as they stream.

Design artifact page and lib/designArtifactData.ts as a second deliverable.

Each pivot documents a real constraint, decision, and tradeoff.

Acknowledges what is and is not implemented: simulated NLU via Claude, parse failure UX gap, no Dialogflow or LUIS integration.

The honest summary

Three ways to understand this work

TECHNICAL UNDERSTANDING

For engineers

The Claude API call (claude-sonnet-4-6) streams over SSE. The client accumulates the text stream and extracts complete JSON sections as braces close, allowing progressive panel fill without blocking. There is no server-side JSON repair pass. A final JSON.parse on the accumulated stream fails silently with no user-facing error state. That is the honest current implementation and it is in the status matrix. Voice uses MediaRecorder and OpenAI Whisper server-side via /api/transcribe, not browser Web Speech. IVR audio uses OpenAI TTS via /api/speak with a Blob URL and HTMLAudioElement specifically for iOS Safari reliability. OpenAI Realtime manages a persistent WebSocket session for Live Call mode, a distinct architecture from the standard transcribe-then-analyze path.
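The brace-counting extraction described here can be sketched as a scanner over the accumulated stream that emits each top-level JSON object the moment its closing brace arrives; string-escape handling is simplified, and this is an illustration rather than the app's exact extractor.

```typescript
// Scan accumulated stream text and return every complete top-level
// JSON object, so panels can fill progressively without blocking.
export function extractSections(stream: string): string[] {
  const sections: string[] = [];
  let depth = 0;
  let start = -1;
  let inString = false;
  for (let i = 0; i < stream.length; i++) {
    const ch = stream[i];
    if (inString) {
      if (ch === "\\") i++;            // skip the escaped character
      else if (ch === '"') inString = false;
      continue;
    }
    if (ch === '"') inString = true;
    else if (ch === "{") {
      if (depth === 0) start = i;      // a new top-level section begins
      depth++;
    } else if (ch === "}") {
      depth--;
      if (depth === 0 && start >= 0) {
        sections.push(stream.slice(start, i + 1));
        start = -1;
      }
    }
  }
  return sections;                     // any unterminated tail is ignored
}
```

An unterminated tail simply yields nothing yet, which mirrors the gap noted above: a final parse failure on a malformed stream currently surfaces no user-facing error.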
PRODUCT UNDERSTANDING

For product

This project is built from a product brief: an enterprise conversational channels team that needed a designer who understood IVR, chatbot, and agent assist as a unified system. The six pivot stories each represent a real product decision with a real tradeoff: empty state, SSE streaming, iOS audio path, full-environment sentiment theming, mobile craft, and Live Call promotion. The design artifact page at /design-artifact is a second deliverable produced from the same build, demonstrating that this designer thinks about conversation architecture as a documentation problem, not just an implementation problem.
DESIGN UNDERSTANDING

For design

The CSS token architecture is the design centerpiece. Five named sentiment states, each driving a complete color system through a single data-sentiment attribute on the root. IBM Plex Sans for interface text, IBM Plex Mono for financial data and classification output. Navy #1B2E4B, teal #0891B2 as the base palette. The panel layout hierarchy reflects practitioner reading order: IVR at 44%, Chatbot and Agent Assist stacked on the right, NLU collapsible below. Mobile is a drawer, not a collapsed state. 44px touch targets, 100dvh, backdrop scrolling.
The bereavement utterance fires. The system suppresses account verification, routes to a senior specialist, and the entire application turns purple. That is not decoration. That is the proof that emotional state handling was designed in, not added on.