
Overview

BaseFnCallModel extends BaseChatModel with prompt-based function calling: function definitions are injected into the prompt, and function calls are parsed from the model's text output.

Class Signature

from qwen_agent.llm.function_calling import BaseFnCallModel

class BaseFnCallModel(BaseChatModel, ABC):
    def __init__(self, cfg: Optional[Dict] = None)

Configuration

fncall_prompt_type (str, default: "nous")
Function calling prompt format: 'qwen' or 'nous'.

Function Calling Formats

Qwen Format

llm = get_chat_model({
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope',
    'generate_cfg': {
        'fncall_prompt_type': 'qwen'
    }
})

Nous Format (Default)

llm = get_chat_model({
    'model': 'custom-model',
    'model_type': 'oai',
    'generate_cfg': {
        'fncall_prompt_type': 'nous'
    }
})
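Under the Nous (Hermes-style) template, the model emits each call as a JSON object wrapped in `<tool_call>` tags in its text output. A minimal sketch of extracting such calls; the exact tag format is an assumption based on the Hermes convention, and `parse_nous_tool_calls` is a hypothetical helper, not part of qwen_agent's API:

```python
import json
import re

def parse_nous_tool_calls(text: str) -> list:
    """Extract JSON tool calls wrapped in <tool_call>...</tool_call> tags."""
    pattern = re.compile(r'<tool_call>\s*(\{.*?\})\s*</tool_call>', re.DOTALL)
    return [json.loads(match) for match in pattern.findall(text)]

output = (
    'Let me calculate that.\n'
    '<tool_call>\n'
    '{"name": "calculator", "arguments": {"expression": "2+2"}}\n'
    '</tool_call>'
)
calls = parse_nous_tool_calls(output)
print(calls[0]['name'])       # calculator
print(calls[0]['arguments'])  # {'expression': '2+2'}
```

BaseFnCallModel performs this parsing internally and surfaces the result as `Message.function_call`, so you normally never handle the raw tags yourself.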

Usage Example

from qwen_agent.llm import get_chat_model
from qwen_agent.llm.schema import Message

llm = get_chat_model({
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope'
})

functions = [{
    'name': 'calculator',
    'description': 'Perform calculations',
    'parameters': {
        'type': 'object',
        'properties': {
            'expression': {'type': 'string'}
        },
        'required': ['expression']
    }
}]

messages = [Message(role='user', content='What is 2+2?')]

# llm.chat streams: each iteration yields the accumulated response so far
for response in llm.chat(messages=messages, functions=functions):
    continue

# Inspect the final response once streaming has finished
if response and response[-1].function_call:
    print(f"Function: {response[-1].function_call.name}")
    print(f"Args: {response[-1].function_call.arguments}")
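The `arguments` field arrives as a JSON string, so executing the call means decoding it and dispatching to your own tool. A minimal sketch, assuming a locally defined calculator (`run_calculator` is a hypothetical helper, not part of qwen_agent):

```python
import json

def run_calculator(arguments: str) -> str:
    """Decode the model's JSON-encoded arguments and evaluate the expression."""
    args = json.loads(arguments)
    # eval() is unsafe for untrusted input; shown here only as a sketch
    return str(eval(args['expression']))

# Simulated arguments string, as the model would emit it
result = run_calculator('{"expression": "2+2"}')
print(result)  # 4
```

To continue the conversation, append the tool output as a function-role message (e.g. `Message(role='function', name='calculator', content=result)`) and call `llm.chat` again.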

Parameters

parallel_function_calls (bool, default: False)
Allow multiple function calls in one response.

function_choice (str, default: "auto")
Force or disable function calling: 'auto', 'none', or a specific function name.
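Both parameters go in `generate_cfg`, alongside `fncall_prompt_type`. A sketch of a config that enables parallel calls and forces the model to call a specific function (the function name `'calculator'` is just the example tool from above):

```python
# Hypothetical config; adjust model/model_type for your backend
cfg = {
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope',
    'generate_cfg': {
        'parallel_function_calls': True,  # allow several calls per turn
        'function_choice': 'calculator',  # or 'auto' / 'none'
    },
}
# llm = get_chat_model(cfg)
```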
