## Prerequisites
Before you begin, ensure you have:

- Rust stable toolchain (edition 2024, which requires Rust ≥ 1.85)
- Git for cloning the repository
- An OpenAI API key or access to a compatible LLM provider
## Set Up Environment Variables
Create a `.env` file in your project root to store your API credentials:
The `.env` file will be loaded automatically by the dotenvy crate.

## Define Your Agent Structure
Create your agent by implementing the `MoFAAgent` trait. Here’s a complete example in `src/main.rs`:
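Since the original listing is not reproduced here, the sketch below shows what such a struct might contain, using the fields explained below. `AgentState` and `LlmClient` are hypothetical stand-ins, not MoFA’s real type definitions:

```rust
// NOTE: AgentState and LlmClient are assumed placeholder types so this
// sketch compiles on its own; MoFA's actual types will differ.
#[derive(Debug, PartialEq)]
enum AgentState {
    Created,
    Running,
    Stopped,
}

struct LlmClient; // placeholder for the provider-backed LLM client

struct MyAgent {
    id: String,                // unique identifier for the agent
    name: String,              // human-readable name
    capabilities: Vec<String>, // used for discovery and routing
    state: AgentState,         // current lifecycle state
    client: LlmClient,         // client for making LLM API calls
}
```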
### Understanding the Structure
- `id`: Unique identifier for your agent
- `name`: Human-readable name
- `capabilities`: Describes what your agent can do (used for discovery and routing)
- `state`: Current lifecycle state of the agent
- `client`: LLM client for making API calls
## Implement the `MoFAAgent` Trait
Now implement the required trait methods in `src/main.rs`:
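MoFA’s real trait is likely async and uses its own input/output and error types; this simplified synchronous sketch, with assumed signatures, illustrates the shape of the three required methods described below:

```rust
// Assumed, simplified signatures — MoFA's actual MoFAAgent trait is
// likely async and uses richer request/response and error types.
trait MoFAAgent {
    fn initialize(&mut self) -> Result<(), String>;
    fn execute(&mut self, input: &str) -> Result<String, String>;
    fn shutdown(&mut self) -> Result<(), String>;
}

struct MyAgent {
    name: String,
    ready: bool,
}

impl MoFAAgent for MyAgent {
    fn initialize(&mut self) -> Result<(), String> {
        // Prepare the agent: load resources, establish connections.
        self.ready = true;
        Ok(())
    }

    fn execute(&mut self, input: &str) -> Result<String, String> {
        // Main task execution: process input and return output.
        if !self.ready {
            return Err("agent not initialized".to_string());
        }
        Ok(format!("[{}] processed: {}", self.name, input))
    }

    fn shutdown(&mut self) -> Result<(), String> {
        // Clean up resources gracefully.
        self.ready = false;
        Ok(())
    }
}
```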
### Core Methods Explained
- `initialize`: Prepare the agent for execution (load resources, establish connections)
- `execute`: The main task execution method; processes input and returns output
- `shutdown`: Clean up resources gracefully
## Complete Example
Here’s the complete code of `src/main.rs` for reference:
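As the original listing is not reproduced here, the sketch below pulls the pieces together under the same assumptions (simplified synchronous trait, placeholder types). It reads `OPENAI_API_KEY` from the environment, which dotenvy would have populated from `.env`:

```rust
use std::env;

// Assumed, simplified trait — the real MoFAAgent trait is likely async.
trait MoFAAgent {
    fn initialize(&mut self) -> Result<(), String>;
    fn execute(&mut self, input: &str) -> Result<String, String>;
    fn shutdown(&mut self) -> Result<(), String>;
}

struct MyAgent {
    id: String,
    name: String,
    capabilities: Vec<String>,
    api_key: String, // stands in for the configured LLM client
}

impl MoFAAgent for MyAgent {
    fn initialize(&mut self) -> Result<(), String> {
        // Verify credentials before doing any work.
        if self.api_key.is_empty() {
            return Err("OPENAI_API_KEY not found".to_string());
        }
        Ok(())
    }

    fn execute(&mut self, input: &str) -> Result<String, String> {
        // A real agent would call the LLM client here.
        Ok(format!("{} ({}): handled '{}'", self.name, self.id, input))
    }

    fn shutdown(&mut self) -> Result<(), String> {
        Ok(())
    }
}

fn main() -> Result<(), String> {
    // In the real project, dotenvy loads .env before this lookup;
    // the fallback lets the sketch run without a real key.
    let api_key = env::var("OPENAI_API_KEY")
        .unwrap_or_else(|_| "demo-key".to_string());

    let mut agent = MyAgent {
        id: "agent-1".to_string(),
        name: "Demo Agent".to_string(),
        capabilities: vec!["summarize".to_string()],
        api_key,
    };

    agent.initialize()?;
    let output = agent.execute("summarize this text")?;
    println!("{output}");
    agent.shutdown()
}
```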
## Next Steps
- **LLM Integration**: Learn how to integrate different LLM providers (OpenAI, Anthropic, Ollama, Gemini)
- **Agent Lifecycle**: Understand the full agent lifecycle (pause, resume, interrupt)
- **Capabilities**: Master agent capabilities and state management
- **API Reference**: Explore the complete MoFA API documentation
## Common Issues
### Error: OPENAI_API_KEY not found
Make sure you’ve created a `.env` file in your project root with your API key.

### Compilation error: edition 2024
MoFA requires Rust 1.85 or newer. Update your toolchain:
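For example, with rustup:

```shell
rustup update stable
rustc --version
```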
### Network timeout errors
If you’re using a proxy or firewall, you may need to configure your network settings. You can also increase the timeout in the LLM configuration.