This guide will help you create your first MoFA agent. By the end, you’ll have a working LLM-powered agent that can answer questions and perform tasks.
Prerequisites: You need Rust 1.85 or newer. If you don’t have Rust installed, see the Installation guide.
Let’s run a simple chat example to verify everything works:
```bash
cd examples
cargo run -p chat_stream
```
You should see output like:
```
========================================
MoFA LLM Agent
========================================
Agent loaded: LLM Agent
Agent ID: llm-agent-...

--- Chat Demo ---
Q: Hello! What can you help me with?
A: I'm an AI assistant that can help you with...
```
If you see this output, congratulations! Your MoFA installation is working correctly.
Every MoFA agent needs an ID, name, capabilities, and state. The LLMClient handles communication with the LLM provider.
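To make that concrete, here is a minimal, self-contained sketch of the fields such an agent carries. `AgentState` and `LLMClient` are placeholder types defined locally for illustration; the real SDK provides its own versions, and the exact field names may differ.

```rust
// Placeholder for the SDK's agent lifecycle state.
#[derive(Debug, PartialEq)]
enum AgentState {
    Created,
    Ready,
}

// Stand-in for the SDK's LLM client, which talks to the provider.
struct LLMClient;

// The core data every agent needs: identity, metadata, state, and a client.
struct LLMAgent {
    id: String,
    name: String,
    capabilities: Vec<String>,
    state: AgentState,
    client: LLMClient,
}

fn main() {
    let agent = LLMAgent {
        id: "llm-agent-001".into(),
        name: "LLM Agent".into(),
        capabilities: vec!["chat".into()],
        state: AgentState::Created,
        client: LLMClient,
    };
    assert_eq!(agent.state, AgentState::Created);
    println!("Agent {} ({}) created", agent.name, agent.id);
}
```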
2. Implement the MoFAAgent trait
```rust
#[async_trait::async_trait]
impl MoFAAgent for LLMAgent {
    fn id(&self) -> &str {
        &self.id
    }
    // ... other methods
}
```
The MoFAAgent trait defines the interface all agents must implement. This includes lifecycle methods (initialize, execute, shutdown) and metadata accessors.
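The shape of that interface can be sketched as a plain Rust trait. This is an illustrative approximation, not the SDK's actual definition: the real trait is async (hence the `async_trait` attribute above) and its exact signatures and error types will differ.

```rust
// Illustrative sketch of an agent interface: metadata accessors plus
// lifecycle methods. Sync and String-errored here for simplicity.
trait MoFAAgent {
    // Metadata accessors
    fn id(&self) -> &str;
    fn name(&self) -> &str;

    // Lifecycle methods
    fn initialize(&mut self) -> Result<(), String>;
    fn execute(&mut self, input: &str) -> Result<String, String>;
    fn shutdown(&mut self) -> Result<(), String>;
}

// A toy implementation showing the expected call order.
struct EchoAgent {
    id: String,
    initialized: bool,
}

impl MoFAAgent for EchoAgent {
    fn id(&self) -> &str {
        &self.id
    }
    fn name(&self) -> &str {
        "Echo"
    }
    fn initialize(&mut self) -> Result<(), String> {
        self.initialized = true;
        Ok(())
    }
    fn execute(&mut self, input: &str) -> Result<String, String> {
        if !self.initialized {
            return Err("initialize() must run first".into());
        }
        Ok(format!("echo: {input}"))
    }
    fn shutdown(&mut self) -> Result<(), String> {
        self.initialized = false;
        Ok(())
    }
}

fn main() {
    let mut agent = EchoAgent { id: "echo-1".into(), initialized: false };
    agent.initialize().unwrap();
    let out = agent.execute("hello").unwrap();
    println!("{out}"); // prints "echo: hello"
    agent.shutdown().unwrap();
}
```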
The ReAct pattern combines reasoning and acting. Here’s a simplified version:
```rust
use mofa_sdk::react::{ReActAgent, ReActTool};
use mofa_sdk::llm::{LLMAgent, LLMAgentBuilder};
use std::sync::Arc;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Create LLM agent
    let llm_agent = Arc::new(
        LLMAgentBuilder::from_env()?
            .with_system_prompt("You are a helpful assistant.")
            .build()
    );

    // Create ReAct agent with tools
    let react_agent = ReActAgent::builder()
        .with_llm(llm_agent)
        .with_tool(Arc::new(WebSearchTool))
        .with_tool(Arc::new(CalculatorTool))
        .with_max_iterations(5)
        .build_async()
        .await?;

    // Run a task
    let result = react_agent
        .run("What is Rust and when was it first released?")
        .await?;

    println!("Answer: {}", result.answer);
    Ok(())
}
```
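Under the hood, a ReAct agent alternates between a reasoning step (deciding which tool to call) and an acting step (calling the tool and observing its result), until it can answer or hits the iteration cap. The toy loop below illustrates that control flow only; every name in it is made up for illustration and none of it is SDK API.

```rust
// A deliberately tiny "tool": sums the integer terms of "a + b".
fn calculator(expr: &str) -> String {
    let terms: Vec<i64> = expr
        .split('+')
        .filter_map(|s| s.trim().parse().ok())
        .collect();
    terms.iter().sum::<i64>().to_string()
}

// Toy ReAct loop: reason about the next action, act, observe, repeat.
fn react_loop(task: &str, max_iterations: usize) -> Option<String> {
    let mut observation = String::new();
    for _ in 0..max_iterations {
        if observation.is_empty() {
            // Act: call a tool and record the observation.
            observation = calculator(task);
        } else {
            // Reason: enough information gathered, produce the answer.
            return Some(format!("The result is {observation}"));
        }
    }
    None // gave up after max_iterations
}

fn main() {
    let answer = react_loop("2 + 3", 5).unwrap();
    println!("{answer}"); // prints "The result is 5"
}
```

The `with_max_iterations(5)` call in the real example serves the same purpose as the loop bound here: it keeps an agent that never reaches a final answer from reasoning forever.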
For simpler use cases, use the high-level LLMAgent:
```rust
use mofa_sdk::llm::LLMAgentBuilder;

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    // Build agent from environment
    let agent = LLMAgentBuilder::from_env()?
        .with_name("My Assistant")
        .with_system_prompt("You are a helpful coding assistant.")
        .with_temperature(0.7)
        .build();

    // Simple question-answer
    let answer = agent.ask("Explain async/await in Rust").await?;
    println!("{}", answer);

    // Multi-turn conversation
    agent.chat("My name is Alice").await?;
    let response = agent.chat("What's my name?").await?;
    println!("{}", response); // Should remember "Alice"

    Ok(())
}
```