Overview

Support for OpenAI models and OpenAI-compatible API endpoints.

Class Signature

from qwen_agent.llm import get_chat_model

llm = get_chat_model({
    'model': 'gpt-4o-mini',
    'model_type': 'oai',
    'api_key': 'your-openai-api-key'
})

Configuration

model
str
required
Model name: 'gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo', etc.
api_key
str
required
OpenAI API key (or set the OPENAI_API_KEY environment variable)
model_type
str
required
Must be 'oai'
base_url
str
optional
Custom API endpoint for OpenAI-compatible servers (e.g. a local vLLM or Ollama server)

Usage Example

from qwen_agent.llm import get_chat_model
from qwen_agent.llm.schema import Message

# OpenAI
llm = get_chat_model({
    'model': 'gpt-4o-mini',
    'model_type': 'oai',
    'api_key': 'sk-...'
})

# Or OpenAI-compatible endpoint
llm = get_chat_model({
    'model': 'custom-model',
    'model_type': 'oai',
    'base_url': 'https://api.example.com/v1',
    'api_key': 'your-key'
})

messages = [Message(role='user', content='Hello!')]
for response in llm.chat(messages=messages):
    print(response[-1].content)

When streaming, each yielded `response` is the full list of response messages generated so far, so `response[-1].content` prints the latest partial text of the final message.
