
Build your first agent

This guide will help you create and run your first AI agent using Microsoft Agent Framework. You’ll have a working agent in just a few minutes.

Prerequisites

Before you begin, make sure you have:
1. Python 3.10+ or .NET 10.0+

2. Azure CLI installed and logged in

   Install it from the Azure CLI docs, then run:

   az login

3. An Azure AI Foundry project

   Create a project at ai.azure.com with an OpenAI deployment (e.g., gpt-4o)

Set up your environment

1. Install the package

   pip install agent-framework --pre

   The --pre flag is required during the preview period.

2. Set environment variables

   export AZURE_AI_PROJECT_ENDPOINT="https://your-project-endpoint"
   export AZURE_OPENAI_RESPONSES_DEPLOYMENT_NAME="gpt-4o"
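A missing or empty variable is a common cause of confusing startup errors, so it can help to check both before running anything. A minimal stdlib sketch (the variable names are taken from the exports above; `missing_vars` is just an illustrative helper, not part of the framework):

```python
import os

REQUIRED_VARS = [
    "AZURE_AI_PROJECT_ENDPOINT",
    "AZURE_OPENAI_RESPONSES_DEPLOYMENT_NAME",
]

def missing_vars(env=os.environ):
    # Return the names of any required variables that are unset or empty.
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with a fake environment: only the endpoint is set,
# so the deployment name shows up as missing.
fake_env = {"AZURE_AI_PROJECT_ENDPOINT": "https://example"}
print(missing_vars(fake_env))
```

Calling `missing_vars()` with no argument checks the real process environment, so you can fail fast with a clear message before constructing the client.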

Create your first agent

Create a simple agent that can answer questions:
# hello_agent.py
import asyncio
import os

from agent_framework.azure import AzureOpenAIResponsesClient
from azure.identity import AzureCliCredential
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

async def main():
    # Create the client with Azure CLI authentication
    credential = AzureCliCredential()
    client = AzureOpenAIResponsesClient(
        project_endpoint=os.environ["AZURE_AI_PROJECT_ENDPOINT"],
        deployment_name=os.environ["AZURE_OPENAI_RESPONSES_DEPLOYMENT_NAME"],
        credential=credential,
    )

    # Create an agent with instructions
    agent = client.as_agent(
        name="HelloAgent",
        instructions="You are a friendly assistant. Keep your answers brief.",
    )

    # Run the agent
    result = await agent.run("What is the capital of France?")
    print(f"Agent: {result}")

if __name__ == "__main__":
    asyncio.run(main())

Run your agent

python hello_agent.py
You should see a response like:
Agent: The capital of France is Paris.

Add streaming support

Get real-time responses as the agent generates them:
# Streaming: receive tokens as they are generated
# (place this inside main(), after creating the agent)
print("Agent (streaming): ", end="", flush=True)
async for chunk in agent.run("Tell me a one-sentence fun fact.", stream=True):
    if chunk.text:
        print(chunk.text, end="", flush=True)
print()
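If you want to try the async-iteration pattern without calling a real model, the same consumption shape can be mimicked with a plain async generator. This is purely illustrative: `fake_stream` is a stand-in, not part of the framework.

```python
import asyncio

async def fake_stream(text):
    # Stand-in for a streaming agent run: yields the reply a few characters at a time.
    for i in range(0, len(text), 4):
        yield text[i:i + 4]
        await asyncio.sleep(0)  # yield control between chunks, as a real stream would

async def main():
    chunks = []
    async for chunk in fake_stream("Paris is the capital of France."):
        print(chunk, end="", flush=True)
        chunks.append(chunk)
    print()
    return "".join(chunks)

full = asyncio.run(main())
```

The loop body (print each chunk as it arrives, accumulate if you need the full text) is the same whether the chunks come from this toy generator or from the agent.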

Add tools to your agent

Give your agent capabilities by adding function tools:
from agent_framework import tool
from typing import Annotated
from pydantic import Field
from random import randint

@tool(approval_mode="never_require")
def get_weather(
    location: Annotated[str, Field(description="The location to get the weather for.")],
) -> str:
    """Get the weather for a given location."""
    conditions = ["sunny", "cloudy", "rainy", "stormy"]
    return f"The weather in {location} is {conditions[randint(0, 3)]} with a high of {randint(10, 30)}°C."

# Create agent with the tool (inside main(), reusing the client from above)
agent = client.as_agent(
    name="WeatherAgent",
    instructions="You are a helpful weather agent. Use the get_weather tool to answer questions.",
    tools=get_weather,
)

result = await agent.run("What's the weather like in Seattle?")
print(f"Agent: {result}")
In production, use approval_mode="always_require" so the user must confirm each tool call before it executes.
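Conceptually, requiring approval means every tool call passes through a gate before it runs. A framework-independent sketch of that idea (`require_approval` and its callback are hypothetical names for illustration, not the framework's API):

```python
def require_approval(fn, approve):
    # Wrap a tool so it only executes when the approve callback says yes.
    def wrapper(*args, **kwargs):
        call_description = f"{fn.__name__}{args}"
        if not approve(call_description):
            return f"Call to {fn.__name__} was declined."
        return fn(*args, **kwargs)
    return wrapper

def get_weather(location):
    # Toy tool standing in for the one defined above.
    return f"The weather in {location} is sunny."

# approve=lambda ...: True plays the role of a user clicking "allow".
gated = require_approval(get_weather, approve=lambda call: True)
print(gated("Seattle"))
```

In a real application the approve callback would prompt the user (or consult a policy) instead of returning a constant; the framework's approval modes automate this gating for you.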

Multi-turn conversations

Maintain conversation context across multiple interactions:
# Create a session to maintain conversation history (inside main())
session = agent.create_session()

# First turn
result = await agent.run("My name is Alice and I love hiking.", session=session)
print(f"Agent: {result}\n")

# Second turn: the agent remembers the context
result = await agent.run("What do you remember about me?", session=session)
print(f"Agent: {result}")
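Under the hood, multi-turn context amounts to keeping prior messages and sending them along with each new request. A minimal stdlib illustration of that bookkeeping (`ToySession` is a teaching aid, not the framework's session object):

```python
class ToySession:
    """Accumulates (role, text) pairs so each new turn can see earlier ones."""

    def __init__(self):
        self.messages = []

    def add(self, role, text):
        self.messages.append((role, text))

    def transcript(self):
        # What a model would effectively receive on the next turn.
        return "\n".join(f"{role}: {text}" for role, text in self.messages)

session = ToySession()
session.add("user", "My name is Alice and I love hiking.")
session.add("assistant", "Nice to meet you, Alice!")
session.add("user", "What do you remember about me?")
print(session.transcript())
```

Because the second user turn is delivered together with the first, the model can answer "What do you remember about me?" even though each request is stateless; the framework's session handles this replay for you.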

Next steps

Now that you have a working agent, explore more capabilities:

- Installation Guide: detailed setup instructions and configuration options
- Core Concepts: learn about agents, tools, sessions, and workflows
- Add Memory: give your agent long-term memory with context providers
- Build Workflows: chain multiple agents and executors together

Production consideration: DefaultAzureCredential (in .NET) is convenient for development, but its credential-chain probing can add latency in production. Prefer ManagedIdentityCredential or another specific credential type for production deployments.
