
Overview

The Microsoft Agent Framework provides multiple hosting options for deploying agents to production environments. Whether you need stateless HTTP endpoints, durable orchestrations, or agent-to-agent communication, the framework offers flexible deployment patterns.

Hosting Options

Azure Functions

Host agents as HTTP-accessible services using Azure Functions with built-in scaling, monitoring, and Azure integration. Best for:
  • Stateless request/response patterns
  • HTTP API endpoints
  • Serverless deployments
  • Azure ecosystem integration
Key Features:
  • Auto-generated HTTP endpoints per agent
  • Built-in health checks
  • Session management
  • Multi-agent hosting
Learn more about Azure Functions hosting →

DurableTask Integration

Leverage Durable Functions and DurableTask for long-running workflows, orchestrations, and stateful agent execution. Best for:
  • Long-running workflows
  • Multi-agent orchestrations
  • Human-in-the-loop patterns
  • Reliable state management
Key Features:
  • Durable orchestrations
  • Checkpoint and resume
  • Parallel agent execution
  • External event handling
Learn more about DurableTask →

A2A Protocol

Connect agents across systems using the Agent-to-Agent (A2A) protocol for standardized inter-agent communication. Best for:
  • Distributed agent systems
  • Cross-framework communication
  • Agent discovery
  • Standardized interfaces
Key Features:
  • Standard protocol compliance
  • Agent discovery via AgentCard
  • Streaming and non-streaming modes
  • Background task handling
Learn more about A2A Protocol →
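An A2A agent advertises its capabilities through an AgentCard served from a well-known path; the A2A specification conventionally uses /.well-known/agent.json. A minimal sketch of building the discovery URL under that assumption (the helper itself is illustrative, not a framework API):

```python
from urllib.parse import urljoin

def agent_card_url(base_url: str) -> str:
    """Build the well-known AgentCard discovery URL for an A2A agent.

    Assumes the card is served at /.well-known/agent.json, the path
    conventionally used by the A2A specification.
    """
    # urljoin with a leading slash anchors the path at the host root,
    # regardless of any path on the agent's base URL.
    return urljoin(base_url, "/.well-known/agent.json")

# agent_card_url("https://agents.example.com/weather")
#   -> "https://agents.example.com/.well-known/agent.json"
```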

Deployment Patterns

Single Agent Hosting

Deploy a single agent with dedicated endpoints:
from agent_framework.azure import AgentFunctionApp, AzureOpenAIChatClient
from azure.identity import AzureCliCredential

agent = AzureOpenAIChatClient(credential=AzureCliCredential()).as_agent(
    name="Assistant",
    instructions="You are a helpful assistant."
)

app = AgentFunctionApp(agents=[agent], enable_health_check=True)

Multi-Agent Hosting

Host multiple specialized agents in a single application:
# `client` is the AzureOpenAIChatClient from the previous example.
weather_agent = client.as_agent(
    name="WeatherAgent",
    instructions="Provide weather information.",
    tools=[get_weather]
)

math_agent = client.as_agent(
    name="MathAgent",
    instructions="Help with calculations.",
    tools=[calculate_tip]
)

app = AgentFunctionApp(agents=[weather_agent, math_agent])

Orchestrated Workflows

Create complex workflows with multiple agents:
from azure.durable_functions import DurableOrchestrationContext

@app.orchestration_trigger(context_name="context")
def agent_workflow(context: DurableOrchestrationContext):
    """Sequential agent execution with shared session."""
    agent = app.get_agent(context, "WriterAgent")
    session = agent.create_session()
    
    # First pass
    initial = yield agent.run(
        messages="Write a sentence about AI.",
        session=session
    )
    
    # Second pass with context
    refined = yield agent.run(
        messages=f"Improve this: {initial.text}",
        session=session
    )
    
    return refined.text
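The sequential example above runs one agent at a time; the parallel-execution feature listed earlier follows a fan-out/fan-in shape. The sketch below illustrates that shape with a stub context standing in for the durable runtime (`task_all` mirrors the Durable Functions API; `run_agent` and the stub are hypothetical stand-ins, not framework APIs):

```python
from types import SimpleNamespace

def parallel_agent_workflow(context, prompts):
    """Fan out one agent run per prompt, then fan in the results."""
    tasks = [context.run_agent("WriterAgent", p) for p in prompts]
    results = yield context.task_all(tasks)  # resumes when every task completes
    return [r.text for r in results]

class StubContext:
    """Minimal stand-in: runs 'tasks' eagerly instead of durably."""
    def run_agent(self, name, prompt):
        return SimpleNamespace(text=f"{name}: {prompt}")
    def task_all(self, tasks):
        return list(tasks)

# Drive the generator the way the durable runtime would.
gen = parallel_agent_workflow(StubContext(), ["intro", "summary"])
pending = next(gen)          # generator yields the combined task
try:
    gen.send(pending)        # runtime replies with the completed results
except StopIteration as done:
    outputs = done.value

# outputs == ["WriterAgent: intro", "WriterAgent: summary"]
```

Because orchestrations are generators, the runtime can checkpoint at each `yield` and replay the function deterministically after a restart.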

Configuration

Environment Variables

All hosting patterns require Azure OpenAI configuration:
export AZURE_OPENAI_ENDPOINT="https://your-resource.openai.azure.com/"
export AZURE_OPENAI_DEPLOYMENT_NAME="gpt-4o-mini"
# Optional: Use API key instead of Azure CLI authentication
export AZURE_OPENAI_API_KEY="your-api-key"
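A small startup check can surface missing settings before the first agent call; this is an illustrative sketch, not part of the framework:

```python
# Verify the required Azure OpenAI settings before constructing any clients.
REQUIRED_VARS = ["AZURE_OPENAI_ENDPOINT", "AZURE_OPENAI_DEPLOYMENT_NAME"]

def missing_config(env):
    """Return the required variables that are absent or empty in `env`."""
    return [name for name in REQUIRED_VARS if not env.get(name)]

# Example with only the endpoint set (pass `dict(os.environ)` in real use):
missing = missing_config({"AZURE_OPENAI_ENDPOINT": "https://example.openai.azure.com/"})
# missing == ["AZURE_OPENAI_DEPLOYMENT_NAME"]
```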

Azure Functions Configuration

For Azure Functions, configure local.settings.json:
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "DURABLE_TASK_SCHEDULER_CONNECTION_STRING": "Endpoint=http://localhost:8080;TaskHub=default;Authentication=None",
    "TASKHUB_NAME": "default",
    "AZURE_OPENAI_ENDPOINT": "<AZURE_OPENAI_ENDPOINT>",
    "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME": "<AZURE_OPENAI_CHAT_DEPLOYMENT_NAME>"
  }
}

Prerequisites

Development Tools

  • Azure Functions Core Tools: Install v4.x
  • Azure CLI: For authentication (az login)
  • Docker: For local emulators (Azurite, DTS)

Azure Services

  • Azure OpenAI Service: Deploy a chat model (gpt-4o-mini or better)
  • Azure Storage: Required for Azure Functions state
  • Durable Task Scheduler (optional): For orchestrations

Local Emulators

Start Azurite for Azure Storage emulation:
docker run -d --name storage-emulator -p 10000:10000 -p 10001:10001 -p 10002:10002 mcr.microsoft.com/azure-storage/azurite
Start Durable Task Scheduler emulator:
docker run -d --name dts-emulator -p 8080:8080 -p 8082:8082 mcr.microsoft.com/dts/dts-emulator:latest
The DTS dashboard will be available at http://localhost:8082.

Authentication

The framework supports multiple authentication methods:

Azure CLI (Development)

az login
from agent_framework.azure import AzureOpenAIChatClient
from azure.identity import AzureCliCredential

credential = AzureCliCredential()
client = AzureOpenAIChatClient(credential=credential)

API Key (Testing)

import os

from agent_framework.azure import AzureOpenAIChatClient

client = AzureOpenAIChatClient(
    api_key=os.environ["AZURE_OPENAI_API_KEY"]
)

Managed Identity (Production)

from agent_framework.azure import AzureOpenAIChatClient
from azure.identity import ManagedIdentityCredential

credential = ManagedIdentityCredential()
client = AzureOpenAIChatClient(credential=credential)

Production Considerations: DefaultAzureCredential is convenient for development, but its runtime credential probing can add latency in production. Use a specific credential such as ManagedIdentityCredential for production deployments.
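A common pattern is to choose the credential based on where the code is running. The sketch below keys off WEBSITE_SITE_NAME, an environment variable Azure App Service and Azure Functions set in hosted environments; the helper itself is illustrative, not a framework API:

```python
def credential_kind(env):
    """Pick a credential type: managed identity in Azure, CLI locally."""
    if env.get("WEBSITE_SITE_NAME"):  # set by Azure App Service / Functions
        return "ManagedIdentityCredential"
    return "AzureCliCredential"

# credential_kind({"WEBSITE_SITE_NAME": "my-func-app"}) -> "ManagedIdentityCredential"
# credential_kind({}) -> "AzureCliCredential"
```

In real code you would instantiate the chosen class from azure.identity rather than return its name; the branch is what matters here.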

Testing Deployed Agents

Test HTTP endpoints using curl, REST clients, or HTTP files:
# Test single agent
curl -X POST http://localhost:7071/api/agents/Assistant/run \
  -H "Content-Type: text/plain" \
  -d "What is machine learning?"

# Test with JSON
curl -X POST http://localhost:7071/api/agents/Assistant/run \
  -H "Content-Type: application/json" \
  -d '{"message": "Explain AI", "session_id": "user-123"}'

# Test orchestration
curl -X POST http://localhost:7071/api/orchestration/run
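The JSON body in the curl example above can also be built programmatically; reusing the same session_id across requests keeps multi-turn context. A sketch assuming the {"message", "session_id"} shape shown in that example:

```python
import json

def agent_request(message, session_id=None):
    """Serialize a request body for the /api/agents/<name>/run endpoint."""
    payload = {"message": message}
    if session_id:
        payload["session_id"] = session_id  # reuse to continue a conversation
    return json.dumps(payload)

body = agent_request("Explain AI", session_id="user-123")
# body == '{"message": "Explain AI", "session_id": "user-123"}'
```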

Monitoring and Observability

Azure Application Insights

Enable Application Insights for production monitoring:
{
  "Values": {
    "APPLICATIONINSIGHTS_CONNECTION_STRING": "InstrumentationKey=..."
  }
}

Logging

Configure logging levels in host.json:
{
  "version": "2.0",
  "logging": {
    "logLevel": {
      "default": "Information",
      "Microsoft.Agents": "Debug"
    }
  }
}

DTS Dashboard

Monitor orchestrations at http://localhost:8082 (local) or your DTS endpoint (Azure).

Next Steps

Azure Functions

Deploy agents as serverless functions

DurableTask

Build long-running workflows

A2A Protocol

Connect distributed agents
