
Overview

The TypeScript implementation provides a complete customer support agent with conversation history, tool integration, and knowledge base search capabilities.

Core Agent Function

chat()

The main agent function that processes user questions and manages tool calls.
Parameters:
  • question (string, required): The user’s question or message to process

Returns:
  • messages (any[]): Complete conversation history including system prompt, user messages, assistant responses, and tool calls
  • output (string): The agent’s final response to the user
import { chat } from './agent';

const result = await chat("Do you have any copy paper?");
console.log(result.output);
// "Yes! We carry several types. Let me check what's available..."

Implementation Details

The chat function is wrapped with LangSmith’s traceable decorator for observability:
const chat = traceable(
  async (question: string): Promise<{ messages: any[]; output: string }> => {
    // Function implementation
  },
  { name: "Emma", metadata: { thread_id: threadId } }
);

Tool Call Loop

The agent implements an agentic loop that:
  1. Sends messages to the LLM with available tools
  2. Executes any tool calls requested by the model
  3. Sends tool results back to the LLM
  4. Repeats until the model provides a final response
while (responseMessage.tool_calls) {
  // Keep the assistant message that requested the tool calls
  messages.push(responseMessage);

  // Execute tool calls
  for (const toolCall of responseMessage.tool_calls) {
    const functionArgs = JSON.parse(toolCall.function.arguments);
    let result = "";

    if (toolCall.function.name === "query_database") {
      result = await queryDatabase(functionArgs.query, dbPath);
    } else if (toolCall.function.name === "search_knowledge_base") {
      result = await searchKnowledgeBase(functionArgs.query);
    }

    // Send the tool result back to the model
    messages.push({
      role: "tool",
      tool_call_id: toolCall.id,
      content: result,
    });
  }

  // Get next response
  response = await client.chat.completions.create({
    model: "gpt-5-nano",
    messages,
    tools,
    tool_choice: "auto",
  });
  responseMessage = response.choices[0].message;
}

Thread Management

getThreadHistory()

Retrieves conversation history for a specific thread.
Parameters:
  • threadId (string, required): Unique identifier for the conversation thread

Returns:
  • messages (any[]): Array of conversation messages for the thread
function getThreadHistory(threadId: string): any[] {
  return threadStore[threadId] || [];
}

saveThreadHistory()

Persists conversation history for a thread.
Parameters:
  • threadId (string, required): Unique identifier for the conversation thread
  • messages (any[], required): Array of messages to save
function saveThreadHistory(threadId: string, messages: any[]): void {
  threadStore[threadId] = messages;
}
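
Both helpers read and write a shared store whose declaration is not shown on this page. A minimal sketch, assuming a module-level in-memory map (a production deployment might back this with Redis or a database instead):

```typescript
// Assumed declaration of threadStore: a module-level in-memory map from
// thread ID to that thread's message array. Not shown on this page.
const threadStore: Record<string, any[]> = {};

function getThreadHistory(threadId: string): any[] {
  return threadStore[threadId] || [];
}

function saveThreadHistory(threadId: string, messages: any[]): void {
  threadStore[threadId] = messages;
}
```

With this shape, an unknown thread ID simply yields an empty history, so the first call to chat() for a new thread starts from a clean slate.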

setThreadId()

Sets the active thread ID for the agent session.
Parameters:
  • id (string, required): Thread identifier to set as active
import { setThreadId } from './agent';

setThreadId("custom-thread-id");
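
setThreadId is presumably a one-liner over module state that chat() reads when loading and saving history. A sketch under that assumption (the getActiveThreadId helper is illustrative only, not part of the documented API):

```typescript
// Assumed: the active thread ID lives in module-level state.
let threadId = "default";

function setThreadId(id: string): void {
  threadId = id;
}

// Illustrative helper for inspection; not part of the documented API.
function getActiveThreadId(): string {
  return threadId;
}
```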

Knowledge Base Management

loadKnowledgeBase()

Loads and indexes knowledge base documents with embeddings caching.
Parameters:
  • kbDir (string, default: "./knowledge_base"): Path to the knowledge base directory
import { loadKnowledgeBase } from './agent';

await loadKnowledgeBase("./knowledge_base");
// Knowledge base loaded from cache: 15 documents

Cache Strategy

The function implements intelligent caching:
  • Checks if embeddings are stale by comparing file modification times
  • Loads from cache if available and up-to-date
  • Regenerates embeddings only when documents change
if (embeddingsAreStale(kbPath, cachePath)) {
  console.log("Knowledge base documents changed, regenerating embeddings...");
  await generateAndCacheEmbeddings(kbPath, cachePath);
} else {
  const cacheData = JSON.parse(fs.readFileSync(cachePath, "utf-8"));
  knowledgeBaseDocs = cacheData.docs;
  knowledgeBaseEmbeddings = cacheData.embeddings;
}
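
The staleness check itself is not shown on this page. A sketch, assuming embeddingsAreStale compares file modification times as described above, treating the cache as stale when it is missing or older than any document:

```typescript
import * as fs from "fs";
import * as path from "path";

// Assumed implementation: the cache is stale when it does not exist, or when
// any file in the knowledge base directory was modified after the cache file.
function embeddingsAreStale(kbPath: string, cachePath: string): boolean {
  if (!fs.existsSync(cachePath)) return true;
  const cacheMtime = fs.statSync(cachePath).mtimeMs;
  return fs
    .readdirSync(kbPath)
    .some((file) => fs.statSync(path.join(kbPath, file)).mtimeMs > cacheMtime);
}
```

This keeps startup cheap in the common case: a single stat per document instead of re-embedding the whole knowledge base.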

Utility Functions

cosineSimilarity()

Calculates cosine similarity between two embedding vectors.
Parameters:
  • a (number[], required): First embedding vector
  • b (number[], required): Second embedding vector

Returns:
  • similarity (number): Cosine similarity score between -1 and 1; values near 1 indicate highly similar vectors (similarities between real embeddings are typically positive)
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
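
As a quick sanity check of this implementation: identical vectors score 1, orthogonal vectors score 0, and opposite vectors score -1.

```typescript
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0]));  // 1
console.log(cosineSimilarity([1, 0], [0, 1]));  // 0
console.log(cosineSimilarity([1, 0], [-1, 0])); // -1
```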

Tool Definitions

The agent uses OpenAI function calling schemas to define available tools:

QUERY_DATABASE_TOOL

Tool schema for database queries:
const QUERY_DATABASE_TOOL = {
  type: "function" as const,
  function: {
    name: "query_database",
    description: "SQL query to get information about our inventory for customers like products, quantities and prices.",
    parameters: {
      type: "object",
      properties: {
        query: {
          type: "string",
          description: `SQL query to execute against the inventory database.

YOU DO NOT KNOW THE SCHEMA. ALWAYS discover it first:
1. Query 'SELECT name FROM sqlite_master WHERE type="table"' to see available tables
2. Use 'PRAGMA table_info(table_name)' to inspect columns for each table
3. Only after understanding the schema, construct your search queries`,
        },
      },
      required: ["query"],
    },
  },
};

SEARCH_KNOWLEDGE_BASE_TOOL

Tool schema for knowledge base search:
const SEARCH_KNOWLEDGE_BASE_TOOL = {
  type: "function" as const,
  function: {
    name: "search_knowledge_base",
    description: "Search company knowledge base for information about policies, procedures, company info, shipping, returns, ordering, contact information, store locations, and business hours.",
    parameters: {
      type: "object",
      properties: {
        query: {
          type: "string",
          description: "Natural language question or search query about company policies or information",
        },
      },
      required: ["query"],
    },
  },
};

Complete Example

import { chat, loadKnowledgeBase, setThreadId } from './agent';
import { uuid7 } from 'langsmith';

// Initialize
const threadId = uuid7();
setThreadId(threadId);
await loadKnowledgeBase('./knowledge_base');

// First interaction
const result1 = await chat("What types of paper do you have?");
console.log(result1.output);

// Follow-up (maintains conversation context)
const result2 = await chat("What's the price of the first option?");
console.log(result2.output);
