Architecture Overview
Open Chat Widget is a full-stack application built with modern technologies. This guide explains how each component works and how they communicate.
Components
1. Chat Widget (Client)
Location : widget/src/index.ts
Technology : Vanilla TypeScript compiled to a single JavaScript bundle
Purpose : Embeddable UI that website visitors interact with
The widget is framework-agnostic and works on any website: WordPress, React, Vue, plain HTML, and more.
Key Features
Self-contained : Single chat-widget.js file with no dependencies
Configurable : Reads options from data-* attributes on the script tag
Session management : Generates and stores unique session IDs in localStorage
Streaming support : Processes NDJSON responses token-by-token
Responsive design : Adapts to mobile and desktop screens
State persistence : Remembers open/closed state across page reloads
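To illustrate the session-management feature, here is a minimal sketch of how a widget can generate and persist a session ID. The storage key and helper names are hypothetical, not taken from the widget source; the ID factory is injectable so the logic stays testable outside a browser.

```typescript
// Hypothetical helper: not the actual widget code, just the general pattern.
const SESSION_KEY = "chat-widget-session"; // assumed key name

type StorageLike = {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

// Returns the existing session ID, or creates and persists a new one.
function getOrCreateSessionId(
  storage: StorageLike,
  makeId: () => string = () => crypto.randomUUID()
): string {
  const existing = storage.getItem(SESSION_KEY);
  if (existing) {
    return existing;
  }
  const id = makeId();
  storage.setItem(SESSION_KEY, id);
  return id;
}
```

In the browser this would be called with `localStorage`; the `StorageLike` interface just keeps the sketch free of DOM dependencies.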
Configuration
The widget reads configuration from the script tag:
type WidgetConfig = {
  apiUrl: string;             // Backend endpoint
  apiKey: string;             // Widget API key
  title: string;              // Header text
  welcomeMessage: string;     // Initial message
  inputPlaceholder: string;   // Input field placeholder
  position: "left" | "right"; // Screen position
  accentColor: string;        // Primary color (hex)
};
Example from source code:
<script
  src="http://localhost:4000/widget/chat-widget.js"
  data-api-url="http://localhost:4000/chat"
  data-api-key="change-me-widget-key"
  data-title="Support"
  data-welcome-message="Hey! How can I help?"
  data-position="right"
  data-accent-color="#0ea5e9"
  defer
></script>
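The attribute-reading step can be sketched as a pure function over the script tag's dataset. The defaults below are illustrative assumptions, not the widget's documented defaults; in the browser, `data-api-url` etc. arrive camel-cased via `document.currentScript.dataset`.

```typescript
type WidgetConfig = {
  apiUrl: string;
  apiKey: string;
  title: string;
  welcomeMessage: string;
  inputPlaceholder: string;
  position: "left" | "right";
  accentColor: string;
};

// In the browser this would receive document.currentScript.dataset;
// accepting a plain record keeps the sketch DOM-free.
function configFromDataset(dataset: Record<string, string | undefined>): WidgetConfig {
  return {
    apiUrl: dataset.apiUrl ?? "",
    apiKey: dataset.apiKey ?? "",
    title: dataset.title ?? "Chat",                          // assumed default
    welcomeMessage: dataset.welcomeMessage ?? "Hi! How can I help?", // assumed default
    inputPlaceholder: dataset.inputPlaceholder ?? "Type a message...", // assumed default
    position: dataset.position === "left" ? "left" : "right",
    accentColor: dataset.accentColor ?? "#0ea5e9"
  };
}
```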
Build Process
The widget is built with esbuild for fast builds and a minimal output bundle:
widget/esbuild.config.mjs
esbuild.build({
  entryPoints: ['src/index.ts'],
  bundle: true,
  minify: true,
  target: 'es2020',
  format: 'iife', // Immediately-invoked function expression
  outfile: 'dist/chat-widget.js'
});
2. Backend API (Server)
Location : backend/src/server.ts
Technology : Node.js with Express framework
Purpose : Process chat requests, manage OpenAI streaming, and enforce security
Core Responsibilities
Authentication : Validates API keys using timing-safe comparison
Rate Limiting : Tracks requests by IP address (30/min default)
CORS Protection : Configurable origin allowlist
Widget Serving : Delivers the compiled JavaScript bundle
Request Flow
When a user sends a chat message:
Validation
// Validate request body
const chatRequestSchema = z.object({
  sessionId: z.string().regex(/^[A-Za-z0-9._:-]{1,128}$/),
  message: z.string().min(1).max(4000)
});
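For readers unfamiliar with zod, the same checks can be written by hand. This sketch mirrors the schema above; the function name is ours, not the backend's.

```typescript
type ChatRequest = { sessionId: string; message: string };

// Mirrors chatRequestSchema: session IDs are 1-128 characters from a safe
// character set, and messages are 1-4000 characters.
function parseChatRequest(body: unknown): ChatRequest | null {
  if (typeof body !== "object" || body === null) return null;
  const { sessionId, message } = body as Record<string, unknown>;
  if (typeof sessionId !== "string" || !/^[A-Za-z0-9._:-]{1,128}$/.test(sessionId)) {
    return null;
  }
  if (typeof message !== "string" || message.length < 1 || message.length > 4000) {
    return null;
  }
  return { sessionId, message };
}
```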
Get or create conversation
// Query Convex for existing conversation or create new
const conversationId = await convex.mutation(
  anyApi.conversations.getOrCreateConversation,
  { sessionId, now: Date.now() }
);
Store user message
await convex.mutation(anyApi.conversations.addMessage, {
  conversationId,
  role: "user",
  content: message,
  createdAt: Date.now()
});
Fetch conversation history
const history = await convex.query(
  anyApi.conversations.getHistoryForModel,
  { conversationId, limit: env.MAX_HISTORY_MESSAGES }
);
Stream OpenAI response
const response = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${env.OPENAI_API_KEY}`,
    "Content-Type": "application/json"
  },
  body: JSON.stringify({
    model: env.OPENAI_MODEL,
    stream: true,
    messages: [
      {
        role: "system",
        content: "You are a concise and helpful AI assistant embedded in a support chat widget."
      },
      ...history
    ]
  })
});
Forward tokens to client
// Parse OpenAI's SSE stream and convert to NDJSON
writeStreamLine(res, { type: "start", conversationId });
// For each token from OpenAI:
writeStreamLine(res, { type: "token", token });
// When complete:
writeStreamLine(res, { type: "done", message: fullMessage, conversationId });
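On the receiving side, the widget has to split the byte stream back into NDJSON lines. A minimal chunk parser might look like this; it is a generic sketch of the technique, not the widget's actual implementation.

```typescript
type StreamEvent =
  | { type: "start"; conversationId: string }
  | { type: "token"; token: string }
  | { type: "done"; message: string; conversationId: string };

// Feed arbitrary chunks in; complete newline-terminated JSON lines come out.
// The caller keeps `carry` (the trailing partial line) between chunks.
function parseNdjsonChunk(
  carry: string,
  chunk: string
): { events: StreamEvent[]; carry: string } {
  const text = carry + chunk;
  const lines = text.split("\n");
  const rest = lines.pop() ?? ""; // last element is an incomplete line (or "")
  const events = lines
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as StreamEvent);
  return { events, carry: rest };
}
```

Because network chunk boundaries rarely align with line boundaries, carrying the trailing partial line forward is the key detail.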
Persist assistant message
await convex.mutation(anyApi.conversations.addMessage, {
  conversationId,
  role: "assistant",
  content: finalMessage,
  createdAt: Date.now()
});
API Endpoints
The backend exposes these routes:
Method  Path                           Purpose                    Auth
GET     /health                        Health check               None
GET     /widget/chat-widget.js         Widget bundle              None
GET     /v1/openapi.json               OpenAPI spec               None
POST    /chat                          Streaming chat (legacy)    Widget API key
POST    /v1/chat                       Non-streaming chat         Widget API key
POST    /v1/chat/stream                Streaming chat             Widget API key
GET     /v1/admin/conversations        List conversations         Admin API key
GET     /v1/admin/conversations/:id    Get conversation thread    Admin API key
Security Features
Rate Limiting
Implemented per IP address in backend/src/server.ts:105-120:
const rateLimitBuckets = new Map<string, RateLimitBucket>();
function isWithinRateLimit(req: Request): boolean {
  const key = getClientIp(req);
  const now = Date.now();
  const existing = rateLimitBuckets.get(key);
  if (!existing || now > existing.resetAt) {
    rateLimitBuckets.set(key, {
      count: 1,
      resetAt: now + env.RATE_LIMIT_WINDOW_MS
    });
    return true;
  }
  existing.count += 1;
  return existing.count <= env.RATE_LIMIT_MAX_REQUESTS;
}
API Key Authentication
Uses timing-safe comparison to prevent timing attacks: backend/src/server.ts:122-131
function secureEquals(left: string, right: string): boolean {
  const leftBuffer = Buffer.from(left);
  const rightBuffer = Buffer.from(right);
  if (leftBuffer.length !== rightBuffer.length) {
    return false;
  }
  return timingSafeEqual(leftBuffer, rightBuffer);
}
CORS Protection
Configurable origin allowlist with production safety: backend/src/server.ts:67-75
const configuredOrigins = env.CORS_ORIGIN.split(",")
  .map((origin) => origin.trim())
  .filter(Boolean);
const allowAllOrigins = configuredOrigins.includes("*");
if (env.NODE_ENV === "production" && allowAllOrigins) {
  throw new Error("Refusing to start with CORS_ORIGIN='*' in production.");
}
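Given that allowlist, the per-request origin check reduces to something like the sketch below. This is a simplified illustration; the actual server presumably wires the allowlist into its CORS middleware rather than calling a helper like this directly.

```typescript
// Returns true if a request's Origin header should be accepted,
// given the configured allowlist (which may contain "*").
function isOriginAllowed(
  origin: string | undefined,
  configuredOrigins: string[]
): boolean {
  if (configuredOrigins.includes("*")) return true; // explicit wildcard
  if (!origin) return false;                        // no Origin header sent
  return configuredOrigins.includes(origin);
}
```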
3. Convex Database
Location : convex/schema.ts, convex/conversations.ts
Technology : Convex (serverless real-time database)
Purpose : Store and query conversations and messages
Convex provides reactive queries, automatic scaling, and built-in TypeScript support.
Schema
Two tables with indexes for efficient queries:
export default defineSchema({
  conversations: defineTable({
    sessionId: v.string(),
    createdAt: v.number(),
    updatedAt: v.number(),
    lastMessage: v.optional(v.string())
  })
    .index("by_session_id", ["sessionId"])
    .index("by_updated_at", ["updatedAt"]),
  messages: defineTable({
    conversationId: v.id("conversations"),
    role: v.union(v.literal("user"), v.literal("assistant")),
    content: v.string(),
    createdAt: v.number()
  })
    .index("by_conversation_id", ["conversationId"])
    .index("by_conversation_id_created_at", ["conversationId", "createdAt"])
});
Key Functions
The main functions are getOrCreateConversation, addMessage, and listConversations. Shown here: getOrCreateConversation (convex/conversations.ts:27-48).
export const getOrCreateConversation = mutation({
  args: {
    sessionId: v.string(),
    now: v.number()
  },
  handler: async (ctx, args) => {
    const existing = await ctx.db
      .query("conversations")
      .withIndex("by_session_id", (q) => q.eq("sessionId", args.sessionId))
      .unique();
    if (existing) {
      return existing._id;
    }
    return await ctx.db.insert("conversations", {
      sessionId: args.sessionId,
      createdAt: args.now,
      updatedAt: args.now,
      lastMessage: ""
    });
  }
});
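The addMessage mutation is not reproduced in full here. Based on the schema and the request flow, it plausibly looks something like this sketch; the patched fields are inferred from the "updates updatedAt and lastMessage" behavior described later, not copied from the source, and this fragment only runs inside the Convex runtime.

```typescript
// Hypothetical reconstruction of addMessage in convex/conversations.ts.
export const addMessage = mutation({
  args: {
    conversationId: v.id("conversations"),
    role: v.union(v.literal("user"), v.literal("assistant")),
    content: v.string(),
    createdAt: v.number()
  },
  handler: async (ctx, args) => {
    await ctx.db.insert("messages", {
      conversationId: args.conversationId,
      role: args.role,
      content: args.content,
      createdAt: args.createdAt
    });
    // Keep the conversation's sort key and preview in sync.
    await ctx.db.patch(args.conversationId, {
      updatedAt: args.createdAt,
      lastMessage: args.content
    });
  }
});
```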
4. Admin Dashboard
Location : dashboard/
Technology : Next.js 15 with React Server Components
Purpose : View and manage conversations
Features
Password authentication : Simple login with DASHBOARD_PASSWORD
Conversation list : Sorted by most recent activity
Thread viewer : See full message history for each session
Responsive design : Works on desktop and mobile
Key Pages
Login Page
Location: /login (dashboard/app/login/page.tsx)
// Password-protected login form
// Sets encrypted cookie on success

Conversation List
Location: /
export default async function DashboardPage() {
  await requireAuth();
  const conversations = await listConversations();
  return (
    <main>
      <h1>Conversations</h1>
      {conversations.map((conversation) => (
        <Link key={conversation._id} href={`/conversations/${conversation._id}`}>
          Session: {conversation.sessionId}
        </Link>
      ))}
    </main>
  );
}

Thread Viewer
Location: /conversations/:id (dashboard/app/conversations/[id]/page.tsx)
// Fetch conversation and messages from Convex
// Display chronological message list
Authentication Flow
import { cookies } from "next/headers";
import { redirect } from "next/navigation";

export async function requireAuth() {
  const cookieStore = await cookies();
  const authCookie = cookieStore.get("dashboard-auth");
  if (!authCookie?.value) {
    redirect("/login");
  }
  // Verify encrypted cookie matches password hash
}
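The cookie-verification step elided above typically signs a value with an HMAC and compares it in constant time. Here is a generic sketch using Node's crypto module; the function names and signing scheme are illustrative assumptions, not the dashboard's actual code.

```typescript
import { createHmac, timingSafeEqual } from "node:crypto";

// Derive the cookie value by signing the dashboard password
// with a server-side secret (hypothetical scheme).
function makeAuthCookieValue(password: string, secret: string): string {
  return createHmac("sha256", secret).update(password).digest("hex");
}

// Compare the presented cookie against a freshly computed
// signature in constant time.
function isAuthCookieValid(cookieValue: string, password: string, secret: string): boolean {
  const expected = makeAuthCookieValue(password, secret);
  const a = Buffer.from(cookieValue);
  const b = Buffer.from(expected);
  return a.length === b.length && timingSafeEqual(a, b);
}
```

Signing rather than storing the raw password keeps the secret out of the browser; the length check before timingSafeEqual is required because that API throws on unequal-length buffers.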
Data Flow Example
Let’s trace a complete user interaction:
User opens chat widget
Widget loads from http://localhost:4000/widget/chat-widget.js
Creates or retrieves sessionId from localStorage: "a1b2c3d4-..."
Displays welcome message
User types message
User enters: “What are your hours?”
Widget POSTs to /chat:
{
  "sessionId": "a1b2c3d4-...",
  "message": "What are your hours?"
}
Backend processes request
Validates API key: ✓
Checks rate limit: ✓
Queries Convex: Find or create conversation for sessionId
Stores user message in Convex
Fetches last 20 messages for context
OpenAI generates response
Backend sends conversation history to OpenAI
OpenAI streams tokens: "Our", " support", " hours", …
Backend forwards each token as NDJSON:
{ "type": "start", "conversationId": "j57..." }
{ "type": "token", "token": "Our" }
{ "type": "token", "token": " support" }
{ "type": "done", "message": "Our support hours are 9am-5pm EST.", "conversationId": "j57..." }
Widget displays response
Receives NDJSON stream
Shows typing indicator (“Thinking…”)
Appends each token in real-time
Displays final message: “Our support hours are 9am-5pm EST.”
Backend stores response
Saves assistant message to Convex
Updates conversation’s updatedAt timestamp
Updates lastMessage preview
Admin views conversation
Admin logs into dashboard
Sees conversation in list (sorted by updatedAt)
Clicks to view full thread:
User: “What are your hours?”
Assistant: “Our support hours are 9am-5pm EST.”
Tech Stack Summary
Frontend
Widget : Vanilla TypeScript → esbuild → single bundle
Dashboard : Next.js 15, React Server Components, Tailwind CSS
Backend
API Server : Node.js 20+, Express, TypeScript
Validation : Zod schemas
HTTP Client : Native fetch API
Database
Convex : Real-time serverless database
Tables : conversations, messages
Indexes : Optimized for session and timestamp queries
AI
Provider : OpenAI
Model : GPT-4-turbo-mini (configurable)
Streaming : Server-sent events (SSE) → NDJSON
DevOps
Build : npm workspaces, concurrent dev mode
Deployment : Docker, Docker Compose
Hosting : Compatible with Railway, Render, Vercel, Fly.io
Performance
Widget Bundle Size : ~15KB gzipped. No dependencies. Loads without blocking via defer.
Streaming Responses : Token-by-token delivery feels instant. First token typically arrives in under 500ms.
Convex Queries : Indexed lookups are ~10-50ms. Reactive subscriptions update in real-time.
Rate Limiting : In-memory buckets prevent abuse. No database queries for rate checks.
Deployment Architecture
Recommended setup for production:
┌─────────────────────┐
│ Users' Browsers │
└──────────┬──────────┘
│
▼
┌─────────────────────┐
│ Your Website(s) │ ← Embed widget script tag
│ (Any hosting) │
└──────────┬──────────┘
│
▼
┌─────────────────────┐
│ Backend API │ ← Railway / Render / Fly.io
│ (Node.js) │ Docker container
└──────────┬──────────┘
│
├─────────────────┐
▼ ▼
┌───────────┐ ┌──────────────┐
│ Convex │ │ OpenAI API │
│ (Database)│ │ (GPT-4) │
└───────────┘ └──────────────┘
┌─────────────────────┐
│ Admin Dashboard │ ← Vercel / Netlify
│ (Next.js) │ Reads from Convex
└─────────────────────┘
Each component scales independently. The widget is served as a static file. The backend is stateless and can run multiple instances behind a load balancer.
Next Steps
Deploy to Production : Learn how to deploy each component to cloud providers
API Reference : Complete endpoint documentation with examples
Customization Guide : Customize widget appearance, colors, and positioning
Security Best Practices : Harden your deployment and protect user data