Text Generation

Genkit provides a simple, unified interface for generating text across all supported AI models. Use the generate() function (or GenerateText() in the Go SDK) to create AI-powered content.

Basic Usage

Generate text with a simple prompt:
import { genkit } from 'genkit';
import { googleAI } from '@genkit-ai/google-genai';

const ai = genkit({ plugins: [googleAI()] });

const { text } = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'),
  prompt: 'Why is Firebase awesome?'
});

console.log(text);

Configuration Options

Basic Configuration

Control generation behavior with configuration options:
const { text } = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'),
  prompt: 'Explain quantum computing',
  config: {
    temperature: 0.7,
    maxOutputTokens: 1000,
    topP: 0.9,
  }
});
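If you reuse the same settings across many calls, a small helper can merge shared defaults with per-call overrides (a sketch only; GenerationConfig and withDefaults are hypothetical local helpers for illustration, not Genkit exports):

```typescript
// Hypothetical local type mirroring the config fields shown above.
interface GenerationConfig {
  temperature?: number;
  maxOutputTokens?: number;
  topP?: number;
}

// Merge per-call overrides onto shared defaults; override fields win.
function withDefaults(
  defaults: GenerationConfig,
  overrides: GenerationConfig = {}
): GenerationConfig {
  return { ...defaults, ...overrides };
}

const config = withDefaults(
  { temperature: 0.7, maxOutputTokens: 1000 },
  { temperature: 0.2 } // lower temperature for this call only
);
```

The merged object can then be passed as the config field of ai.generate().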

System Messages

Set the AI’s behavior and personality with system messages:
const { text } = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'),
  system: 'You are a helpful chef assistant. Keep answers concise.',
  prompt: 'How do I make pasta carbonara?'
});

Multi-Turn Conversations

Build conversations by passing message history:
const messages = [
  { role: 'user', content: [{ text: 'What is the capital of France?' }] },
  { role: 'model', content: [{ text: 'The capital of France is Paris.' }] },
  { role: 'user', content: [{ text: 'What is its population?' }] }
];

const { text } = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'),
  messages
});
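Because each turn is just an entry in the messages array, a small helper keeps history construction tidy (a sketch; addTurn and the Message type are hypothetical helpers matching the array shape above, not Genkit exports):

```typescript
type Role = 'user' | 'model';

// Mirrors the message shape used in the example above.
interface Message {
  role: Role;
  content: { text: string }[];
}

// Return a new history array with one more turn appended,
// leaving earlier snapshots of the conversation untouched.
function addTurn(history: Message[], role: Role, text: string): Message[] {
  return [...history, { role, content: [{ text }] }];
}

let history: Message[] = [];
history = addTurn(history, 'user', 'What is the capital of France?');
history = addTurn(history, 'model', 'The capital of France is Paris.');
history = addTurn(history, 'user', 'What is its population?');
```

The resulting array can be passed directly as messages in ai.generate().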

Using Flows

Wrap your generation logic in flows for better observability and deployment:
import { z } from 'genkit';

const jokeFlow = ai.defineFlow(
  {
    name: 'tellJoke',
    inputSchema: z.string(),
    outputSchema: z.string(),
  },
  async (topic) => {
    const { text } = await ai.generate({
      model: googleAI.model('gemini-2.5-flash'),
      prompt: `Tell me a joke about ${topic}`
    });
    return text;
  }
);

const joke = await jokeFlow('programming');

Model Providers

Genkit supports multiple AI providers with a unified interface:

Google AI (Gemini)

import { googleAI } from '@genkit-ai/google-genai';

const ai = genkit({ plugins: [googleAI()] });

const { text } = await ai.generate({
  model: googleAI.model('gemini-2.5-flash'),
  prompt: 'Hello!'
});

Anthropic (Claude)

The Anthropic plugin example below uses Genkit's Go SDK:

import "github.com/firebase/genkit/go/plugins/anthropic"

g := genkit.Init(ctx, genkit.WithPlugins(&anthropic.Anthropic{}))

text, _ := genkit.GenerateText(ctx, g,
    ai.WithModelName("anthropic/claude-3-5-sonnet"),
    ai.WithPrompt("Hello!"),
)

Ollama (Local Models)

import "github.com/firebase/genkit/go/plugins/ollama"

g := genkit.Init(ctx, genkit.WithPlugins(&ollama.Ollama{
    ServerAddress: "http://localhost:11434",
}))

text, _ := genkit.GenerateText(ctx, g,
    ai.WithModelName("ollama/llama3.1"),
    ai.WithPrompt("Hello!"),
)

Multiple Providers

import (
    "github.com/firebase/genkit/go/plugins/anthropic"
    "github.com/firebase/genkit/go/plugins/googlegenai"
)

g := genkit.Init(ctx, genkit.WithPlugins(
    &googlegenai.GoogleAI{},
    &anthropic.Anthropic{},
))

// Use Gemini
geminiText, _ := genkit.GenerateText(ctx, g,
    ai.WithModelName("googleai/gemini-2.5-flash"),
    ai.WithPrompt("Hello!"),
)

// Use Claude
claudeText, _ := genkit.GenerateText(ctx, g,
    ai.WithModelName("anthropic/claude-3-5-sonnet"),
    ai.WithPrompt("Hello!"),
)
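Model references across providers follow a provider/model naming convention. A small lookup table keeps those strings in one place (a TypeScript sketch, for consistency with the earlier examples; MODELS and modelFor are hypothetical helpers, and the values are simply the model names used in this guide):

```typescript
// Fully qualified model identifiers, keyed by provider,
// following the "provider/model" convention shown above.
const MODELS: Record<string, string> = {
  googleai: 'googleai/gemini-2.5-flash',
  anthropic: 'anthropic/claude-3-5-sonnet',
  ollama: 'ollama/llama3.1',
};

// Look up a model reference, failing loudly on unknown providers.
function modelFor(provider: string): string {
  const name = MODELS[provider];
  if (!name) throw new Error(`Unknown provider: ${provider}`);
  return name;
}
```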
