Quickstart
Install the package and run your first agentic workflow in minutes.
Core Concepts
Understand agents, workflows, orchestration, memory, and tools.
Guides
Step-by-step guides for building common agentic patterns.
API Reference
Full API documentation for every class and method.
What’s included
Modular Agents
PlanningAgent, ExecutionAgent, MonitoringAgent, and LocalAgent — each with a focused role and a clean interface.
Workflow Patterns
SequentialWorkflow for ordered execution and ParallelWorkflow for concurrent multi-task processing.
LangGraph Orchestration
A LangGraph state machine that coordinates planning, execution, and monitoring with automatic retry logic.
Short-Term Memory
SQLite-backed memory with exact-match caching to avoid redundant LLM calls across sessions.
Context Compression
CompressContextTool reduces prompt size locally — no extra LLM call required.
Local & Cloud LLMs
Works with Ollama for local models and any OpenAI-compatible API for cloud inference.
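The short-term memory feature above pairs a SQLite store with exact-match caching so a repeated prompt never triggers a second LLM call. A minimal, self-contained sketch of that idea using only the standard library (the class and table names here are illustrative, not the library's actual API):

```python
import sqlite3


class ExactMatchCache:
    """Illustrative sketch: cache LLM responses keyed by the exact prompt text."""

    def __init__(self, path=":memory:"):
        # A file path (e.g. "memory.db") would persist the cache across sessions.
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS cache (prompt TEXT PRIMARY KEY, response TEXT)"
        )

    def get(self, prompt):
        row = self.conn.execute(
            "SELECT response FROM cache WHERE prompt = ?", (prompt,)
        ).fetchone()
        return row[0] if row else None

    def put(self, prompt, response):
        self.conn.execute(
            "INSERT OR REPLACE INTO cache (prompt, response) VALUES (?, ?)",
            (prompt, response),
        )
        self.conn.commit()


cache = ExactMatchCache()
if cache.get("summarize X") is None:   # miss: this is where the LLM would be called
    cache.put("summarize X", "X is ...")
result = cache.get("summarize X")      # hit: served from SQLite, no LLM call
```

Because the key is the exact prompt string, even a one-character difference is a cache miss — exact-match caching trades recall for zero risk of returning a stale answer to a different question.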
How it works
Initialize an LLM
Configure a local Ollama model or a cloud OpenAI-compatible endpoint via environment variables.
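One way this environment-driven setup could look — note that the variable names and defaults below are assumptions for illustration, not the package's documented configuration:

```python
import os

# Hypothetical variable names; consult the project's docs for the real ones.
# The fallback points at Ollama's default local endpoint (port 11434).
base_url = os.environ.get("LLM_BASE_URL", "http://localhost:11434/v1")
api_key = os.environ.get("LLM_API_KEY", "ollama")  # local servers usually ignore the key
model = os.environ.get("LLM_MODEL", "llama3")

# Any OpenAI-compatible client can then be pointed at these values; switching
# to a cloud provider is just a matter of exporting different variables.
llm_config = {"base_url": base_url, "api_key": api_key, "model": model}
```

Keeping the endpoint, key, and model out of code means the same workflow runs unchanged against a local Ollama instance or a hosted API.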
Choose a workflow or orchestrator
Use SequentialWorkflow for ordered pipelines, ParallelWorkflow for concurrent tasks, or LangGraphOrchestrator for a full state machine with retries.
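The sequential-versus-parallel distinction can be sketched with standard-library tools alone (the real workflow classes' APIs are not reproduced here; `run_task` is a stand-in for one agent step):

```python
from concurrent.futures import ThreadPoolExecutor


def run_task(name):
    # Stand-in for a single agent step, e.g. one LLM call.
    return f"{name}: done"


tasks = ["plan", "execute", "monitor"]

# Sequential: tasks run strictly in order, so each step could
# read the results of the steps before it.
sequential_results = [run_task(t) for t in tasks]

# Parallel: independent tasks run concurrently; pool.map still
# returns results in input order, but execution overlaps in time.
with ThreadPoolExecutor() as pool:
    parallel_results = list(pool.map(run_task, tasks))
```

Sequential fits pipelines where later steps depend on earlier output; parallel fits fan-out over independent subtasks, with an orchestrator layered on top when you also need retries and state transitions.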