
Overview

The FnCallAgent is a general-purpose function calling agent that integrates LLM capabilities with tool usage. It is the foundation for higher-level agents such as Assistant and ReActChat, providing the core function calling loop and tool execution functionality.

Key Features

  • Native Function Calling: built-in support for OpenAI-compatible function calling
  • Parallel Execution: execute multiple independent tools simultaneously for efficiency
  • Memory Management: automatic file and conversation context management
  • Iterative Tool Use: multi-step reasoning with repeated tool calls

Constructor

from qwen_agent.agents import FnCallAgent

agent = FnCallAgent(
    function_list=['code_interpreter', 'web_search'],
    llm={'model': 'qwen-max-latest', 'model_type': 'qwen_dashscope'},
    system_message='You are a helpful assistant.',
    name='MyAgent',
    description='A function calling agent',
    files=['./context.txt']
)

Parameters

function_list
list
List of tools to enable. Can be:
  • Tool names: 'code_interpreter'
  • Tool configs: {'name': 'code_interpreter', 'timeout': 30}
  • Tool objects: CodeInterpreter()
llm
dict | BaseChatModel
LLM configuration with model, model_type, and optionally api_key, model_server, generate_cfg.
system_message
str
default: "You are a helpful assistant"
System message defining the agent’s role and behavior.
name
str
Agent identifier for multi-agent scenarios.
description
str
Agent description used for routing in multi-agent systems.
files
list
Initial files to load into the agent’s memory for context.
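A fuller configuration can be sketched as plain dictionaries before passing them to the constructor. The keys follow the parameter table above; the `timeout` and `top_p` values here are illustrative assumptions, not defaults:

```python
# LLM configuration with the optional keys from the table above.
llm_cfg = {
    'model': 'qwen-max-latest',
    'model_type': 'qwen_dashscope',
    'api_key': 'YOUR_API_KEY',            # or set the DASHSCOPE_API_KEY env var
    'generate_cfg': {'top_p': 0.8},       # extra generation parameters
}

# function_list entries may be mixed freely across the three accepted forms:
function_list = [
    'code_interpreter',                   # plain tool name
    {'name': 'web_search', 'timeout': 30},  # tool config dict (timeout is illustrative)
    # CodeInterpreter(),                  # or a tool instance
]
```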

Basic Usage

1. Initialize the Agent

from qwen_agent.agents import FnCallAgent

agent = FnCallAgent(
    function_list=['code_interpreter'],
    llm={
        'model': 'qwen-max-latest',
        'model_type': 'qwen_dashscope',
        'api_key': 'YOUR_API_KEY'
    }
)
2. Run the Agent

messages = [{
    'role': 'user',
    'content': 'Calculate the factorial of 10'
}]

for response in agent.run(messages=messages):
    print(response)
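Note that agent.run is a generator: each yield is the accumulated response list so far, and the final yield is the complete response. The loop above relies on this streaming behavior. A standalone sketch of pulling the final text out of such a stream, using simulated plain data rather than a live agent:

```python
def last_text(responses):
    """Return the text content of the final message in a response list."""
    return responses[-1].get('content', '') if responses else ''

# Simulated stream: each yield is the full response list so far.
stream = [
    [{'role': 'assistant', 'content': 'The factorial of 10'}],
    [{'role': 'assistant', 'content': 'The factorial of 10 is 3628800.'}],
]

final = []
for final in stream:   # drain the stream; 'final' keeps the last yield
    pass
print(last_text(final))
```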

Function Calling Example

from qwen_agent.agents import FnCallAgent
from qwen_agent.tools.base import BaseTool, register_tool
import json5

# Define a custom tool
@register_tool('get_weather')
class WeatherTool(BaseTool):
    description = 'Get current weather for a location'
    parameters = [{
        'name': 'location',
        'type': 'string',
        'description': 'City name',
        'required': True
    }]
    
    def call(self, params: str, **kwargs) -> str:
        location = json5.loads(params)['location']
        # A real implementation would call a weather API for `location` here;
        # this stubbed result is for illustration only:
        return json5.dumps({'temperature': 72, 'condition': 'sunny'})

# Create agent with the tool
agent = FnCallAgent(
    function_list=['get_weather'],
    llm={'model': 'qwen-max-latest', 'model_type': 'qwen_dashscope'}
)

messages = [{'role': 'user', 'content': "What's the weather in San Francisco?"}]

for response in agent.run(messages=messages):
    print(response)
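For reference, one successful tool round produces messages shaped roughly like this. The field names follow the OpenAI-compatible function calling format the agent uses; the exact content values are illustrative:

```python
import json

round_messages = [
    # 1. The model decides to call the tool (arguments are a JSON string):
    {'role': 'assistant', 'content': '', 'function_call': {
        'name': 'get_weather',
        'arguments': json.dumps({'location': 'San Francisco'}),
    }},
    # 2. The tool result is fed back with role 'function':
    {'role': 'function', 'name': 'get_weather',
     'content': json.dumps({'temperature': 72, 'condition': 'sunny'})},
    # 3. The model composes the final answer from the tool result:
    {'role': 'assistant',
     'content': "It's 72°F and sunny in San Francisco."},
]

args = json.loads(round_messages[0]['function_call']['arguments'])
```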

Parallel Function Calling

The agent automatically runs independent tool calls in parallel when the model requests several in a single turn:

from qwen_agent.agents import FnCallAgent

agent = FnCallAgent(
    function_list=['web_search', 'code_interpreter'],
    llm={'model': 'qwen-max-latest', 'model_type': 'qwen_dashscope'}
)

# This query may trigger multiple tools in parallel
messages = [{
    'role': 'user',
    'content': 'Search for Python statistics libraries and calculate their GitHub stars'
}]

for response in agent.run(messages=messages):
    print(response)

Multi-turn Conversations

messages = []

# First turn
messages.append({'role': 'user', 'content': 'Calculate 5 factorial'})
response = []
for response in agent.run(messages=messages):
    pass  # drain the stream; 'response' ends up holding the final yield
messages.extend(response)

# Second turn: the agent sees the full history, so it remembers context
messages.append({'role': 'user', 'content': 'Now multiply that by 3'})
response = []
for response in agent.run(messages=messages):
    pass
messages.extend(response)

print(messages[-1]['content'])  # Final answer
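The drain-then-extend pattern above can be wrapped in a small helper. run_turn is a hypothetical convenience function (not part of the qwen_agent API), assuming agent.run yields complete response lists as shown:

```python
def run_turn(agent, messages, user_text):
    """Append a user message, run one agent turn, and fold the reply back in."""
    messages.append({'role': 'user', 'content': user_text})
    responses = []
    for responses in agent.run(messages=messages):
        pass  # drain the stream; keep the final (complete) yield
    messages.extend(responses)
    return messages[-1]['content']
```

With it, each turn collapses to a single call, e.g. `answer = run_turn(agent, messages, 'Calculate 5 factorial')`.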

With File Context

from qwen_agent.agents import FnCallAgent

# Agent can access file content for context
agent = FnCallAgent(
    function_list=['code_interpreter'],
    llm={'model': 'qwen-max-latest', 'model_type': 'qwen_dashscope'},
    files=['./data.csv', './config.json']
)

messages = [{
    'role': 'user',
    'content': 'Analyze the data.csv file and create a visualization'
}]

for response in agent.run(messages=messages):
    print(response)

Advanced: Custom Tool with State

from qwen_agent.tools.base import BaseTool
import json5

class StatefulTool(BaseTool):
    name = 'counter'
    description = 'A counter that tracks state across calls'
    parameters = [{
        'name': 'operation',
        'type': 'string',
        'description': 'increment or get',
        'required': True
    }]
    
    def __init__(self, cfg=None):
        super().__init__(cfg)
        self.count = 0
    
    def call(self, params: str, **kwargs) -> str:
        operation = json5.loads(params)['operation']
        if operation == 'increment':
            self.count += 1
        return json5.dumps({'count': self.count})

# Use the stateful tool
tool = StatefulTool()
agent = FnCallAgent(
    function_list=[tool],
    llm={'model': 'qwen-max-latest', 'model_type': 'qwen_dashscope'}
)

Best Practices

  • Only include tools relevant to your use case
  • Provide clear, specific tool descriptions
  • Use descriptive parameter names and types
  • Test tools individually before combining them
  • Tools should return error messages in JSON format
  • Handle timeouts gracefully with appropriate timeout values
  • Log tool execution for debugging
  • Validate tool parameters before execution
  • Leverage parallel execution for independent operations
  • Set appropriate MAX_LLM_CALL_PER_RUN limit (default: 10)
  • Use streaming for better user experience
  • Cache expensive tool results when possible
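Several of these practices (validating parameters, returning errors as JSON) can be sketched in a tool's call body. safe_call below is a standalone illustration using stdlib json, not part of the qwen_agent API; a real tool would put this logic inside a BaseTool subclass's call method:

```python
import json

def safe_call(params: str) -> str:
    """A tool body that validates its input and reports errors as JSON."""
    try:
        args = json.loads(params)
    except ValueError:
        # Malformed input from the model: report it rather than crash.
        return json.dumps({'error': 'params must be a valid JSON string'})
    location = args.get('location')
    if not isinstance(location, str) or not location:
        return json.dumps({'error': "missing required string parameter 'location'"})
    # Real work would go here; stubbed result for illustration:
    return json.dumps({'temperature': 72, 'condition': 'sunny'})
```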

Comparison with Other Agents

| Feature | FnCallAgent | Assistant | ReActChat |
| --- | --- | --- | --- |
| Function Calling | ✓ | ✓ | ✓ |
| RAG Support | ✗ | ✓ | ✗ |
| Format | OpenAI | OpenAI | ReAct Text |
| Parallel Tools | ✓ | ✓ | ✗ |
| Best For | Tool usage | RAG + Tools | Reasoning traces |

  • Function Calling: learn about function calling concepts
  • Custom Tools: create custom tools for your agent
  • Assistant Agent: use Assistant for RAG + function calling
  • API Reference: complete API documentation
