## Overview

`BaseFnCallModel` extends `BaseChatModel` with function calling capabilities implemented through prompt-based approaches: function specifications are injected into the prompt, and function calls are parsed out of the model's text output.
## Class Signature

```python
from qwen_agent.llm.function_calling import BaseFnCallModel

class BaseFnCallModel(BaseChatModel, ABC):
    def __init__(self, cfg: Optional[Dict] = None):
        ...
```
## Configuration

The `fncall_prompt_type` key in `generate_cfg` selects the function calling prompt format: `'qwen'` or `'nous'`.

```python
llm = get_chat_model({
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope',
    'generate_cfg': {
        'fncall_prompt_type': 'qwen'
    }
})
```

```python
llm = get_chat_model({
    'model': 'custom-model',
    'model_type': 'oai',
    'generate_cfg': {
        'fncall_prompt_type': 'nous'
    }
})
```
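The prompt-based mechanism behind these formats can be illustrated with a minimal, self-contained sketch. This is not the library's actual implementation; the prompt wording and tag names below only approximate the `'nous'` (Hermes-style) format, in which tool specs are embedded in the system prompt and calls are parsed from tagged JSON in the model's reply:

```python
import json
import re

def build_tool_system_prompt(functions: list) -> str:
    """Embed JSON function specs in the system prompt (Hermes-style sketch)."""
    tool_specs = "\n".join(json.dumps(f) for f in functions)
    return (
        "You may call tools to answer the user. Available tools:\n"
        f"<tools>\n{tool_specs}\n</tools>\n"
        'To call a tool, reply with <tool_call>{"name": ..., "arguments": ...}</tool_call>'
    )

def parse_tool_call(model_output: str):
    """Extract the first <tool_call> JSON block from model text, if any."""
    m = re.search(r"<tool_call>\s*(\{.*?\})\s*</tool_call>", model_output, re.DOTALL)
    return json.loads(m.group(1)) if m else None

# A hypothetical model reply containing a tool call:
reply = '<tool_call>{"name": "calculator", "arguments": {"expression": "2+2"}}</tool_call>'
call = parse_tool_call(reply)
# call == {'name': 'calculator', 'arguments': {'expression': '2+2'}}
```

Plain text with no `<tool_call>` tags parses as no call, which is how an ordinary chat reply is distinguished from a function call.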
## Usage Example

```python
from qwen_agent.llm import get_chat_model
from qwen_agent.llm.schema import Message

llm = get_chat_model({
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope'
})

functions = [{
    'name': 'calculator',
    'description': 'Perform calculations',
    'parameters': {
        'type': 'object',
        'properties': {
            'expression': {'type': 'string'}
        },
        'required': ['expression']
    }
}]

messages = [Message(role='user', content='What is 2+2?')]

for response in llm.chat(messages=messages, functions=functions):
    if response[-1].function_call:
        print(f"Function: {response[-1].function_call.name}")
        print(f"Args: {response[-1].function_call.arguments}")
```
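After a function call is returned, the caller is responsible for executing the function and sending the result back as a `'function'`-role message on the next `llm.chat()` call. The round trip is sketched below with the model's reply simulated so the snippet stays self-contained; plain dict messages are used in place of `Message` objects, and the toy `eval`-based calculator is for illustration only:

```python
import json

# Simulated function call, shaped like the example above would print it
# (in qwen_agent, FunctionCall.arguments is a JSON string).
function_call = {'name': 'calculator', 'arguments': '{"expression": "2+2"}'}

# Execute the requested function locally.
args = json.loads(function_call['arguments'])
result = str(eval(args['expression']))  # toy calculator; never eval untrusted input

# Append the assistant's call and the function result, then chat again.
messages = [{'role': 'user', 'content': 'What is 2+2?'}]
messages.append({'role': 'assistant', 'content': '', 'function_call': function_call})
messages.append({'role': 'function', 'name': 'calculator', 'content': result})
# next_responses = llm.chat(messages=messages, functions=functions)
```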
## Parameters

Additional `generate_cfg` options:

- `parallel_function_calls`: allow multiple function calls in a single response
- `function_choice`: force function calling behavior: `'auto'`, `'none'`, or the name of a specific function
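These options sit in `generate_cfg` alongside `fncall_prompt_type`. A configuration sketch, assuming the key names `parallel_function_calls` and `function_choice` (verify against your installed qwen-agent version):

```python
cfg = {
    'model': 'qwen-max',
    'model_type': 'qwen_dashscope',
    'generate_cfg': {
        # Let the model emit several function calls in one response.
        'parallel_function_calls': True,
        # 'auto', 'none', or a specific function name to force.
        'function_choice': 'auto',
    }
}
# llm = get_chat_model(cfg)
```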
## See Also