PromptSmith provides seamless integration with the Vercel AI SDK through the toAiSdk() method, which exports both system prompts and tools in a single, ready-to-use configuration object.

Installation

1. Install dependencies

Install PromptSmith, Zod, and the Vercel AI SDK:

npm install promptsmith-ts zod ai

2. Install AI provider SDK

Choose your AI provider and install the corresponding SDK:

# OpenAI
npm install @ai-sdk/openai

# Anthropic
npm install @ai-sdk/anthropic

# Google
npm install @ai-sdk/google

Basic Integration

The simplest way to use PromptSmith with Vercel AI SDK is through the toAiSdk() method, which returns a configuration object that can be spread directly into AI SDK functions.
import { createPromptBuilder } from "promptsmith-ts/builder";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const agent = createPromptBuilder()
  .withIdentity("You are a helpful travel assistant")
  .withCapabilities([
    "Recommend destinations based on user preferences",
    "Plan detailed itineraries",
    "Provide travel tips and advice",
  ])
  .withTone("Enthusiastic, knowledgeable, and helpful");

const { text } = await generateText({
  model: openai("gpt-4"),
  ...agent.toAiSdk(), // Spreads { system, tools }
  prompt: "I want to visit Japan for 2 weeks. What should I see?",
});

console.log(text);

Integration with Tools

PromptSmith automatically converts tool definitions to Vercel AI SDK format. Define tools once with full type safety, and they’re ready to use.
import { createPromptBuilder } from "promptsmith-ts/builder";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

// Mock weather API function
async function fetchWeather(location: string, units: string) {
  // In production, call actual weather API
  return {
    location,
    temperature: units === "celsius" ? "22°C" : "72°F",
    conditions: "Partly cloudy",
    humidity: "65%",
  };
}

const weatherAgent = createPromptBuilder()
  .withIdentity("You are a weather information assistant")
  .withCapabilities([
    "Provide current weather conditions",
    "Answer weather-related questions",
  ])
  .withTool({
    name: "get_weather",
    description: "Get current weather for a location",
    schema: z.object({
      location: z.string().describe("City name or coordinates"),
      units: z.enum(["celsius", "fahrenheit"]).default("celsius"),
    }),
    execute: async ({ location, units }) => {
      return await fetchWeather(location, units);
    },
  })
  .withConstraint("must", "Always use the weather tool for current conditions")
  .withConstraint(
    "must_not",
    "Provide weather information without checking the tool"
  )
  .withTone("Friendly and informative");

const { text } = await generateText({
  model: openai("gpt-4"),
  ...weatherAgent.toAiSdk(),
  prompt: "What's the weather like in Tokyo?",
});

console.log(text);

Streaming Responses

Use streamText for real-time streaming responses:
import { streamText } from "ai";
import { createPromptBuilder } from "promptsmith-ts/builder";
import { openai } from "@ai-sdk/openai";

const chatAgent = createPromptBuilder()
  .withIdentity("You are a helpful coding assistant")
  .withCapabilities(["Write code", "Debug issues", "Explain concepts"])
  .withTone("Patient and encouraging");

const { textStream } = await streamText({
  model: openai("gpt-4"),
  ...chatAgent.toAiSdk(),
  prompt: "Explain how React hooks work",
});

for await (const chunk of textStream) {
  process.stdout.write(chunk);
}

Advanced: Multi-Step Tool Usage

The Vercel AI SDK supports multi-step tool execution. Here’s an example with multiple tools that work together:
import { createPromptBuilder } from "promptsmith-ts/builder";
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { z } from "zod";

const researchAgent = createPromptBuilder()
  .withIdentity("You are a research assistant")
  .withCapabilities([
    "Search academic papers",
    "Summarize research",
    "Save papers to library",
  ])
  .withTool({
    name: "search_papers",
    description: "Search academic papers by topic",
    schema: z.object({
      topic: z.string().describe("Research topic"),
      limit: z.number().default(5).describe("Max results"),
    }),
    execute: async ({ topic, limit }) => {
      // Mock implementation
      return [
        { id: "paper1", title: "Research on " + topic, relevance: 0.95 },
        { id: "paper2", title: "Study of " + topic, relevance: 0.88 },
      ];
    },
  })
  .withTool({
    name: "summarize_paper",
    description: "Generate summary of a research paper",
    schema: z.object({
      paperId: z.string().describe("Paper ID to summarize"),
    }),
    execute: async ({ paperId }) => {
      return `Summary of ${paperId}: This paper explores...`;
    },
  })
  .withTool({
    name: "save_to_library",
    description: "Save paper to user's library",
    schema: z.object({
      paperId: z.string().describe("Paper ID"),
      tags: z.array(z.string()).describe("Tags for organization"),
    }),
    execute: async ({ paperId, tags }) => {
      return `Saved ${paperId} with tags: ${tags.join(", ")}`;
    },
  })
  .withExamples([
    {
      user: "Find papers on quantum computing and save the most relevant one",
      assistant:
        "I'll search for papers, summarize the top result, and save it for you.",
      explanation: "Demonstrates multi-step tool usage",
    },
  ]);

const { text } = await generateText({
  model: anthropic("claude-3-5-sonnet-20241022"),
  ...researchAgent.toAiSdk(),
  prompt: "Find papers on machine learning in healthcare and save the top 2",
  maxSteps: 10, // Allow multiple tool calls
});

console.log(text);

Next.js Integration

Use PromptSmith in Next.js API routes for production applications:
app/api/chat/route.ts
import { createPromptBuilder } from "promptsmith-ts/builder";
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const agent = createPromptBuilder()
  .withIdentity("Customer service assistant for TechStore")
  .withCapabilities(["Search products", "Track orders", "Process returns"])
  .withTool({
    name: "search_products",
    description: "Search product catalog",
    schema: z.object({
      query: z.string().describe("Search query"),
      category: z.string().optional().describe("Product category"),
    }),
    execute: async ({ query, category }) => {
      // Query your own data layer; `db` here is a placeholder for it
      return await db.products.search({ query, category });
    },
  })
  .withGuardrails()
  .withTone("Friendly and professional");

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = await streamText({
    model: openai("gpt-4-turbo"),
    ...agent.toAiSdk(),
    messages,
  });

  return result.toDataStreamResponse();
}

Type Safety

PromptSmith provides full type inference for tool parameters:
import { createPromptBuilder } from "promptsmith-ts/builder";
import { z } from "zod";

const builder = createPromptBuilder().withTool({
  name: "calculate",
  description: "Perform calculations",
  schema: z.object({
    operation: z.enum(["add", "subtract", "multiply", "divide"]),
    a: z.number(),
    b: z.number(),
  }),
  execute: async ({ operation, a, b }) => {
    // TypeScript knows:
    // - operation is "add" | "subtract" | "multiply" | "divide"
    // - a is number
    // - b is number
    switch (operation) {
      case "add":
        return a + b;
      case "subtract":
        return a - b;
      case "multiply":
        return a * b;
      case "divide":
        return b !== 0 ? a / b : "Cannot divide by zero";
    }
  },
});

What’s Exported

The toAiSdk() method returns an object with two properties:
type AiSdkConfig = {
  // The complete system prompt as a string
  system: string;

  // Tools in Vercel AI SDK format
  tools: Record<
    string,
    {
      description: string;
      parameters: z.ZodType;
      execute?: (args: unknown) => Promise<unknown> | unknown;
    }
  >;
};
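To make the spread pattern concrete, here is a minimal, dependency-free sketch of an object matching the AiSdkConfig shape being spread into a call-options object (the config values are invented for illustration):

```typescript
// Minimal stand-in for the AiSdkConfig shape above (values are illustrative only)
const config = {
  system: "You are a helpful travel assistant",
  tools: {
    get_weather: {
      description: "Get current weather for a location",
      execute: async (args: unknown) => ({ ok: true, args }),
    },
  },
};

// Spreading places `system` and `tools` as top-level keys alongside
// `model` and `prompt` — exactly the shape generateText / streamText expect.
const callOptions = {
  model: "gpt-4",
  ...config,
  prompt: "I want to visit Japan for 2 weeks.",
};

console.log(Object.keys(callOptions)); // ["model", "system", "tools", "prompt"]
```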

Alternative: Export Separately

You can also export the system prompt and tools separately:
import { createPromptBuilder } from "promptsmith-ts/builder";
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";

const agent = createPromptBuilder()
  .withIdentity("Assistant")
  .withTool({ name: "search", description: "Search", schema: z.object({}) });

// Export system prompt only
const systemPrompt = agent.build();

// Export tools only
const tools = agent.toAiSdkTools();

// Use separately
const response = await generateText({
  model: openai("gpt-4"),
  system: systemPrompt,
  tools: tools,
  prompt: "Hello",
});

Best Practices

Always use Zod schemas with .describe() for full type safety and automatic documentation:
schema: z.object({
  query: z.string().describe("The search query"),
  limit: z.number().optional().describe("Max results to return"),
})
Always enable guardrails for production agents:
const agent = createPromptBuilder()
  .withIdentity("Assistant")
  .withGuardrails() // Protects against prompt injection
  .withForbiddenTopics(["Sensitive topic 1", "Sensitive topic 2"]);
Provide examples to guide the model’s behavior:
builder.withExamples([
  {
    user: "What's the weather?",
    assistant: "I'll check the weather for you. Where are you located?",
    explanation: "Shows how to ask for missing information",
  },
]);
Use the built-in testing framework:
import { createTester } from "promptsmith-ts/tester";
import { openai } from "@ai-sdk/openai";

const tester = createTester();
const results = await tester.test({
  prompt: agent,
  provider: openai("gpt-4"),
  testCases: [
    {
      query: "Hello!",
      expectedBehavior: "Respond with a friendly greeting",
    },
  ],
});

console.log(`Score: ${results.overallScore}/100`);

Next Steps

Mastra Integration

Learn how to use PromptSmith with Mastra

Other Frameworks

Integrate with other AI frameworks
