Intelligent conversation orchestrators that combine AI with deterministic business logic
Agents are the brain of your conversational applications. They orchestrate the conversation by combining natural language understanding with deterministic execution, personality configuration, and business context awareness.
Basic identity that appears in dashboards and routing:
```csharp
public class BusinessAppAgentGeneral
{
    public string Emoji { get; set; } = "🤖";

    [MultiLanguageProperty]
    public Dictionary<string, string> Name { get; set; }

    [MultiLanguageProperty]
    public Dictionary<string, string> Description { get; set; }
}
```
Example configuration:
- Emoji: 🏥
- Name (English): "Medical Appointment Assistant"
- Name (Arabic): "مساعد المواعيد الطبية"
- Description: "Helps patients schedule, reschedule, and confirm medical appointments"
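In code, this configuration might look like the following sketch. The `"en"`/`"ar"` dictionary keys are assumptions — the class only declares `Dictionary<string, string>`, so check your platform's language-key convention:

```csharp
// Hypothetical setup of the agent's basic identity.
// Language keys ("en", "ar") are assumed, not confirmed by the API.
var general = new BusinessAppAgentGeneral
{
    Emoji = "🏥",
    Name = new Dictionary<string, string>
    {
        ["en"] = "Medical Appointment Assistant",
        ["ar"] = "مساعد المواعيد الطبية"
    },
    Description = new Dictionary<string, string>
    {
        ["en"] = "Helps patients schedule, reschedule, and confirm medical appointments"
    }
};
```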
```csharp
public class BusinessAppAgentPersonality
{
    [MultiLanguageProperty]
    public Dictionary<string, string> Name { get; set; }

    [MultiLanguageProperty]
    public Dictionary<string, string> Role { get; set; }

    [MultiLanguageProperty]
    public Dictionary<string, List<string>> Capabilities { get; set; }

    [MultiLanguageProperty]
    public Dictionary<string, List<string>> Ethics { get; set; }

    [MultiLanguageProperty]
    public Dictionary<string, List<string>> Tone { get; set; }
}
```
Example configuration:
Name: "Dr. Sarah"
Role: "Professional medical appointment coordinator"
Capabilities:
- Schedule new appointments
- Reschedule existing appointments
- Send appointment reminders
- Answer questions about office locations
Ethics:
- Never share patient information
- Always verify identity before rescheduling
- Escalate medical questions to actual doctors
Tone:
- Professional and warm
- Patient and understanding
- Clear and concise
- Empathetic to patient concerns
These personality traits are automatically injected into the AI’s system prompt, shaping how it generates responses.
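The "Dr. Sarah" example could be expressed in code as follows. This is a sketch: the `"en"` language key is an assumption, as with the other multi-language properties:

```csharp
// Hypothetical personality configuration; "en" keys are assumed.
var personality = new BusinessAppAgentPersonality
{
    Name = new Dictionary<string, string> { ["en"] = "Dr. Sarah" },
    Role = new Dictionary<string, string>
    {
        ["en"] = "Professional medical appointment coordinator"
    },
    Capabilities = new Dictionary<string, List<string>>
    {
        ["en"] = new List<string>
        {
            "Schedule new appointments",
            "Reschedule existing appointments",
            "Send appointment reminders",
            "Answer questions about office locations"
        }
    },
    Ethics = new Dictionary<string, List<string>>
    {
        ["en"] = new List<string>
        {
            "Never share patient information",
            "Always verify identity before rescheduling",
            "Escalate medical questions to actual doctors"
        }
    },
    Tone = new Dictionary<string, List<string>>
    {
        ["en"] = new List<string>
        {
            "Professional and warm",
            "Patient and understanding",
            "Clear and concise",
            "Empathetic to patient concerns"
        }
    }
};
```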
Controls how the agent handles turn-taking in real-time conversations:
```csharp
public class BusinessAppAgentInterruption
{
    public BusinessAppAgentInterruptionTurnEnd TurnEnd { get; set; }
    public bool UseTurnByTurnMode { get; set; } = false;
    public bool? IncludeInterruptedSpeechInTurnByTurnMode { get; set; }
    public BusinessAppAgentInterruptionPauseTrigger? PauseTrigger { get; set; }
    public BusinessAppAgentInterruptionVerification? Verification { get; set; }
}
```
Turn-taking modes:
- VAD (Voice Activity Detection): Standard silence detection
- ML-based projection: Predicts when the user has finished speaking
- LLM-based decision: The AI distinguishes between pauses and actual turn ends
This is critical for natural voice conversations where detecting “uh-huh” vs. a true interruption matters.
For voice agents handling sensitive information, use LLM-based turn detection to better understand when users are pausing to think vs. actually finished speaking.
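A minimal interruption setup might look like the sketch below. Only the boolean properties are taken from the class definition above; the shape of `BusinessAppAgentInterruptionTurnEnd` (and any mode-selection value on it) is not shown in this document, so that part is left as a placeholder:

```csharp
// Sketch: enable turn-by-turn mode and keep interrupted speech in context.
// The TurnEnd configuration is a placeholder — its members are not
// documented here, so consult the API for how to select LLM-based detection.
var interruption = new BusinessAppAgentInterruption
{
    UseTurnByTurnMode = true,
    IncludeInterruptedSpeechInTurnByTurnMode = true
    // TurnEnd = new BusinessAppAgentInterruptionTurnEnd { /* LLM-based turn detection */ }
};
```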
Connects the agent to vector databases for Retrieval Augmented Generation (RAG):
```csharp
public class BusinessAppAgentKnowledgeBase
{
    public bool Enabled { get; set; }
    public AgentKnowledgeBaseSearchStrategyTypeENUM SearchStrategy { get; set; }
    public List<string> CollectionIds { get; set; }
    public int MaxResults { get; set; }
    public double SimilarityThreshold { get; set; }
}
```
When enabled, the agent automatically searches your knowledge base and includes relevant information in its responses.
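Enabling RAG might look like this sketch. The collection IDs are hypothetical, the `SearchStrategy` value is omitted because the enum's members are not documented here, and the interpretation of `SimilarityThreshold` as a 0-to-1 similarity cutoff is an assumption:

```csharp
// Hypothetical RAG configuration; collection IDs are illustrative only.
var knowledgeBase = new BusinessAppAgentKnowledgeBase
{
    Enabled = true,
    CollectionIds = new List<string> { "clinic-faq", "insurance-policies" },
    MaxResults = 5,            // cap how many retrieved chunks reach the prompt
    SimilarityThreshold = 0.75 // assumed: drop matches scoring below this
    // SearchStrategy = ...    // pick a value from AgentKnowledgeBaseSearchStrategyTypeENUM
};
```

Tuning note: a higher `SimilarityThreshold` trades recall for precision — fewer, more relevant chunks in the prompt.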
Transfer scripts: Conversation flows from other agents
This allows complex scenarios like:
1. Agent starts with "Greeting" script
2. User asks to schedule appointment → Agent loads "Scheduling" script
3. User asks about billing → Agent transfers to "Billing Agent"
When transferring between agents, ensure the receiving agent has access to necessary context variables. Use the ScriptTransferToAgentNodeReferences to track dependencies.
Be explicit about what the agent can and cannot do:
Capabilities:
- Schedule appointments
- Answer hours and location questions
- Send confirmation emails

Ethics:
- Never provide medical advice
- Never cancel appointments without confirmation
- Always offer to transfer to a human for complex issues