
Your First Agent

Let’s create a simple agent that estimates the size of objects. This example demonstrates the core Fast Agent pattern.
Step 1: Create your project directory

Create a new directory for your Fast Agent project:
mkdir my-fast-agent
cd my-fast-agent
Step 2: Create your agent file

Create a file called agent.py with the following code:
agent.py
import asyncio
from fast_agent import FastAgent

# Create the application
fast = FastAgent("Agent Example")

@fast.agent(
    instruction="Given an object, respond only with an estimate of its size."
)
async def main():
    async with fast.run() as agent:
        await agent.interactive()

if __name__ == "__main__":
    asyncio.run(main())
That’s it! This is a complete Fast Agent application.
Step 3: Run your agent

Run your agent with uv:
uv run agent.py
This will start an interactive chat session with your agent. Try asking:
> the moon
> a basketball
> the Eiffel Tower
Use the --model flag to specify a different model:
uv run agent.py --model sonnet
uv run agent.py --model gpt-4.1
uv run agent.py --model o3-mini.low
A suffix such as .low or .high selects the reasoning effort for reasoning models.

Understanding the Code

Let’s break down what’s happening:

1. Create a FastAgent App

fast = FastAgent("Agent Example")
This creates a new Fast Agent application. The string is just a descriptive name.

2. Define an Agent

@fast.agent(
    instruction="Given an object, respond only with an estimate of its size."
)
The @fast.agent decorator defines an agent with a specific instruction. The instruction is the base prompt that guides the agent’s behavior.

3. Run the Agent

async with fast.run() as agent:
    await agent.interactive()
This starts the agent and opens an interactive chat session.

Sending Messages Programmatically

Instead of interactive mode, you can send messages directly:
agent.py
import asyncio
from fast_agent import FastAgent

fast = FastAgent("Agent Example")

@fast.agent(
    instruction="Given an object, respond only with an estimate of its size."
)
async def main():
    async with fast.run() as agent:
        moon_size = await agent("the moon")
        print(moon_size)

if __name__ == "__main__":
    asyncio.run(main())
Run it:
uv run agent.py

Using the Scaffold Command

Fast Agent provides a scaffold command to generate a complete agent template with configuration files:
fast-agent scaffold
This creates:
  • agent.py - A template agent file
  • fastagent.config.yaml - Configuration for models and MCP servers
  • fastagent.secrets.yaml.example - Example secrets file
  • .gitignore - Git ignore rules
  • pyproject.toml.tmpl - Project template
The scaffolded agent includes template variables like {{serverInstructions}}, {{agentSkills}}, and {{currentDate}} that get automatically populated at runtime.
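The exact substitution mechanism is internal to Fast Agent, but the idea can be sketched with plain string templating. The variable names below mirror the scaffold; the helper itself is illustrative, not Fast Agent's implementation:

```python
import re
from datetime import date

def fill_template(instruction: str, variables: dict[str, str]) -> str:
    """Replace {{name}} placeholders with values; unknown names stay intact."""
    return re.sub(
        r"\{\{(\w+)\}\}",
        lambda m: variables.get(m.group(1), m.group(0)),
        instruction,
    )

instruction = "Today is {{currentDate}}. Available skills: {{agentSkills}}."
print(fill_template(instruction, {
    "currentDate": date.today().isoformat(),
    "agentSkills": "fetch, summarize",
}))
```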

Adding MCP Servers

Let’s create an agent that uses MCP servers to fetch web content and write social media posts.
Step 1: Create the workflow file

Create social.py:
social.py
import asyncio
from fast_agent import FastAgent

fast = FastAgent("Social Media Writer")

@fast.agent(
    "url_fetcher",
    instruction="Given a URL, provide a complete and comprehensive summary",
    servers=["fetch"],  # MCP server defined in config
)
@fast.agent(
    "social_media",
    instruction="""
    Write a 280 character social media post for any given text. 
    Respond only with the post, never use hashtags.
    """,
)
@fast.chain(
    name="post_writer",
    sequence=["url_fetcher", "social_media"],
)
async def main():
    async with fast.run() as agent:
        await agent.post_writer.send("https://llmindset.co.uk")

if __name__ == "__main__":
    asyncio.run(main())
Step 2: Create the configuration file

Create fastagent.config.yaml to define the MCP server:
fastagent.config.yaml
default_model: gpt-5-mini.low

mcp:
  targets:
    - name: fetch
      target: "uvx mcp-server-fetch"
Step 3: Run the workflow

uv run social.py
Or from the command line:
uv run social.py --agent post_writer --message "https://example.com"

How It Works

  1. url_fetcher agent uses the fetch MCP server to retrieve and summarize the URL
  2. social_media agent takes the summary and writes a concise social post
  3. post_writer chain connects them in sequence
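Conceptually, a chain is function composition over messages: each agent's reply becomes the next agent's input. A standalone sketch of the idea (stub functions stand in for the agents; this is not Fast Agent's internals):

```python
import asyncio
from typing import Awaitable, Callable

Agent = Callable[[str], Awaitable[str]]

async def run_chain(message: str, steps: list[Agent]) -> str:
    """Pipe a message through each step, feeding every reply into the next."""
    for step in steps:
        message = await step(message)
    return message

# Stand-ins for url_fetcher and social_media:
async def fetch_and_summarize(url: str) -> str:
    return f"Summary of {url}"

async def write_post(summary: str) -> str:
    return f"Post: {summary}"[:280]  # keep within the 280-character limit

print(asyncio.run(run_chain("https://example.com", [fetch_and_summarize, write_post])))
```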
Add --quiet to disable the progress display and print only the final response, which is useful for simple automations:
uv run social.py --agent post_writer --message "url" --quiet
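The --quiet form makes the workflow easy to script from other programs. A minimal wrapper (the helper names are hypothetical; it assumes social.py from the step above is in the working directory):

```python
import subprocess

def build_command(agent: str, message: str) -> list[str]:
    """Assemble the CLI invocation for a quiet, single-shot run."""
    return ["uv", "run", "social.py", "--agent", agent, "--message", message, "--quiet"]

def run_post_writer(url: str) -> str:
    """Invoke the chain and return only the final response text."""
    result = subprocess.run(
        build_command("post_writer", url),
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()
```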

Interactive Features

Fast Agent provides rich interactive features:

Switch Between Agents

During a chain workflow, you can switch agents by typing @agent-name:
> @url_fetcher
> @social_media

Use MCP Prompts

If your MCP server provides prompts, you can apply them interactively:
> /prompt prompt_name

Request Human Input

Agents can request human input when they need additional context:
@fast.agent(
    instruction="An AI agent that assists with basic tasks. Request Human Input when needed.",
    human_input=True,
)

Using Quickstart Templates

Fast Agent includes several quickstart templates for common patterns:

Workflow Example

Generate example workflows including chaining, parallel, router, and orchestrator:
fast-agent quickstart workflow
Run the examples:
uv run workflow/chaining.py
uv run workflow/parallel.py
uv run workflow/router.py

Researcher Agent

Create a researcher agent with evaluator-optimizer workflow:
fast-agent quickstart researcher
This demonstrates:
  • Web search with the fetch MCP server
  • Quality evaluation and iterative improvement
  • Multi-agent collaboration
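The evaluator-optimizer pattern alternates generation and critique until quality is acceptable. A plain-Python sketch of the loop (stub functions, illustrative only; Fast Agent wires real agents into these roles):

```python
def evaluator_optimizer(task, generate, evaluate, max_refinements=3, threshold=0.8):
    """Generate a draft, score it, and refine with feedback until good enough."""
    draft = generate(task, feedback=None)
    for _ in range(max_refinements):
        score, feedback = evaluate(task, draft)
        if score >= threshold:
            break
        draft = generate(task, feedback=feedback)
    return draft

# Stubs standing in for the generator and evaluator agents:
def generate(task, feedback):
    return f"{task} (revised: {feedback})" if feedback else task

def evaluate(task, draft):
    return (1.0, "ok") if "revised" in draft else (0.5, "add detail")

print(evaluator_optimizer("research summary", generate, evaluate))
```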

Data Analysis Agent

Create a data analysis agent with filesystem access:
fast-agent quickstart data-analysis

Model Selection

Fast Agent supports multiple model providers with simple aliases:

Anthropic Models

uv run agent.py --model haiku
uv run agent.py --model sonnet
uv run agent.py --model opus

OpenAI Models

uv run agent.py --model gpt-4.1
uv run agent.py --model gpt-4.1-mini
uv run agent.py --model o1
uv run agent.py --model o3-mini.low
uv run agent.py --model o3-mini.high

Ollama (Local Models)

fast-agent go --model generic.qwen2.5

Configure Default Model

Set the default model in fastagent.config.yaml:
fastagent.config.yaml
default_model: gpt-5-mini.low

Quick Reference

Command Line

fast-agent go

Agent Syntax

# Simple agent
@fast.agent(
    instruction="You are a helpful agent"
)

# Named agent with MCP servers
@fast.agent(
    name="researcher",
    instruction="Research topics using web search",
    servers=["fetch"],
    model="sonnet",
)

# Chain workflow
@fast.chain(
    name="workflow",
    sequence=["agent1", "agent2"],
)

Calling Agents

# Interactive mode
await agent.interactive()

# Send a message
result = await agent("your message")

# Named agent
result = await agent.researcher("topic")
result = await agent.researcher.send("topic")

# Chain workflow
await agent.workflow.send("input")

Next Steps

Now that you’ve built your first agent, explore more advanced features:

Agent Workflows

Learn about chain, parallel, router, and orchestrator patterns

MCP Servers

Connect agents to tools and data sources via MCP

Configuration

Configure models, logging, and advanced options

Examples

Explore complete examples and use cases

Getting Help

If you need assistance, consult the Fast Agent documentation and community channels.
