A demonstration of using OpenAI’s Agents SDK with Nebius Token Factory’s API to create an AI assistant with custom tools. This example shows how to integrate external APIs and create function-based tools.

Features

  • Custom AI assistant using Nebius’s LLMs via OpenAI SDK
  • Email sending capability using Resend API
  • Async/await pattern for efficient execution
  • Custom model provider implementation
  • Function-based tool creation

Prerequisites

  • Python 3.10 or later (the code uses str | None union syntax)
  • A Nebius Token Factory API key
  • A Resend API key with a verified sender address

Installation

1. Clone the repository

git clone https://github.com/Arindam200/awesome-ai-apps.git
cd starter_ai_agents/openai_agents_sdk

2. Install dependencies

pip install -r requirements.txt

3. Configure environment

Create a .env file with the required API keys:

NEBIUS_API_KEY=your_nebius_api_key_here
RESEND_API_KEY=your_resend_api_key_here
EXAMPLE_BASE_URL=https://api.tokenfactory.nebius.com/v1
EXAMPLE_MODEL_NAME=meta-llama/Meta-Llama-3.1-8B-Instruct
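The example reads these values with os.getenv, so they must be in the process environment before main.py runs. Many starters use python-dotenv for this; if you prefer to avoid the extra dependency, a minimal standard-library loader is enough (a sketch — the parsing rules here are assumptions, not part of the example):

```python
import os

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: KEY=value lines, '#' comments, stdlib only."""
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Variables already set in the environment win over the file.
            os.environ.setdefault(key.strip(), value.strip())
```

Call load_env() once at startup, before any os.getenv lookups.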

Implementation

Custom Model Provider

Create a custom model provider to use Nebius AI with OpenAI SDK:
main.py
from openai import AsyncOpenAI
from agents import (
    Agent,
    Model,
    ModelProvider,
    OpenAIChatCompletionsModel,
    RunConfig,
    Runner,
    function_tool,
    set_tracing_disabled,
)
import os

# Initialize OpenAI client with Nebius endpoint
api_key = os.getenv("NEBIUS_API_KEY")
base_url = os.getenv("EXAMPLE_BASE_URL", "https://api.tokenfactory.nebius.com/v1")
model_name = os.getenv("EXAMPLE_MODEL_NAME", "meta-llama/Meta-Llama-3.1-8B-Instruct")

client = AsyncOpenAI(base_url=base_url, api_key=api_key)
set_tracing_disabled(disabled=True)

class CustomModelProvider(ModelProvider):
    def get_model(self, requested_model: str | None) -> Model:
        """Return a chat-completions model backed by the Nebius client.

        Falls back to the default from EXAMPLE_MODEL_NAME when no model
        name is passed in, so the provider never hands the client None.
        """
        return OpenAIChatCompletionsModel(
            model=requested_model or model_name, openai_client=client
        )

CUSTOM_MODEL_PROVIDER = CustomModelProvider()

Email Tool Implementation

Create a function tool for sending emails:
main.py
import resend
from agents import function_tool

resend.api_key = os.getenv("RESEND_API_KEY")

@function_tool
def send_email(to: str, subject: str, body: str):
    """
    Sends an email using the Resend API.
    
    Args:
        to: Recipient email address.
        subject: Subject line of the email.
        body: HTML content of the email.
    
    Returns:
        A dictionary with status "success" and the message ID if sent, 
        or status "error" and an error message if sending fails.
    """
    print(f"Sending email to {to}")
    params = {
        "from": "[email protected]",  # Replace with your verified sender email
        "to": [to],
        "subject": subject,
        "html": body,
    }
    try:
        response = resend.Emails.send(params)
        return {"status": "success", "message_id": response.get("id")}
    except Exception as e:
        return {"status": "error", "message": str(e)}
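Before handing an address to the API, a cheap syntactic check can catch obvious mistakes early. The helper below is a sketch and not part of the example; the pattern is deliberately loose, since real deliverability is decided by Resend:

```python
import re

# Loose sanity pattern: exactly one '@', no whitespace, a dot in the domain.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def is_valid_recipient(address: str) -> bool:
    """Return True when the address looks syntactically plausible."""
    return bool(EMAIL_RE.match(address))
```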

Agent Execution

main.py
import asyncio

async def main():
    """
    Runs an example agent that sends an email using a haiku response style.
    """
    agent = Agent(
        name="Assistant", 
        instructions="You only respond in haikus.", 
        tools=[send_email]
    )

    result = await Runner.run(
        agent,
        "Send an email to [email protected] with the subject 'Test Email' and the body 'Demo for the Video'",
        run_config=RunConfig(model_provider=CUSTOM_MODEL_PROVIDER),
    )
    print(result.final_output)

if __name__ == "__main__":
    asyncio.run(main())

Usage

Run the agent:
python main.py
The agent will:
  1. Create an assistant that responds in haikus
  2. Use the email tool to send a test email
  3. Output the result
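Because send_email returns a plain dict with a status field, calling code (or a log line) can branch on the outcome. A small sketch of interpreting that shape — the helper name is hypothetical:

```python
def summarize_send_result(result: dict) -> str:
    """Condense the send_email return dict into a one-line status message."""
    if result.get("status") == "success":
        return f"sent (id={result.get('message_id')})"
    return f"failed: {result.get('message', 'unknown error')}"
```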

Technical Details

Architecture

  • OpenAI SDK: agent framework with async support
  • Nebius AI: LLM provider via an OpenAI-compatible API
  • Resend API: email sending service integration
  • Function Tools: declarative tool definitions

Model Configuration

The agent supports any model from Nebius Token Factory. Configure via environment variables:
  • EXAMPLE_BASE_URL: API endpoint
  • EXAMPLE_MODEL_NAME: Model identifier

Customization

Change Assistant Instructions

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant that provides detailed explanations.",
    tools=[send_email]
)

Add More Tools

import urllib.request

@function_tool
def fetch_data(url: str):
    """Fetch the text content of a URL."""
    with urllib.request.urlopen(url) as response:
        return response.read().decode()

agent = Agent(
    name="Assistant",
    instructions="Your instructions here",
    tools=[send_email, fetch_data]
)

Use Different Models

Modify the environment variable or pass directly:
EXAMPLE_MODEL_NAME=Qwen/Qwen3-30B-A3B

Environment Variables

  • NEBIUS_API_KEY: Nebius API key (required)
  • RESEND_API_KEY: Resend API key (required)
  • EXAMPLE_BASE_URL: Nebius API endpoint (default: https://api.tokenfactory.nebius.com/v1)
  • EXAMPLE_MODEL_NAME: Model to use (default: meta-llama/Meta-Llama-3.1-8B-Instruct)

Next Steps

  • MCP Integration: connect to external services via MCP
  • Multi-Agent Systems: build complex agent workflows
