Quickstart
Install dependencies and run your first agentic workflow in minutes.
Configuration
Configure LLMs, memory, agents, and tools for your environment.
What is Agentic Patterns?
Agentic Patterns is a Python library (agenticpatterns) that layers three architectural ideas on top of LangChain and LangGraph:
- Role-specific agents — discrete classes for planning, execution, monitoring, and local summarization.
- Workflow patterns — sequential and parallel orchestration strategies that coordinate agents.
- Supporting infrastructure — SQLite short-term memory with exact-match caching, context compression, and web search via curl.
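The curl-based web search in the list above can be sketched in plain Python. This is a rough illustration, not the library's implementation: `build_opensearch_command` and `search_snippets` are hypothetical helpers, and the real CurlSearchTool may use different parameters or parsing.

```python
import json
import shlex
import subprocess
from urllib.parse import urlencode

def build_opensearch_command(query: str, limit: int = 3) -> list[str]:
    # Assemble a curl invocation against the Wikipedia OpenSearch API.
    params = urlencode({
        "action": "opensearch",
        "search": query,
        "limit": limit,
        "format": "json",
    })
    return ["curl", "-s", f"https://en.wikipedia.org/w/api.php?{params}"]

def search_snippets(query: str, limit: int = 3) -> list[str]:
    # OpenSearch responses have the shape [query, titles, descriptions, urls];
    # index 2 holds the description snippets (it may be empty for some queries).
    raw = subprocess.run(build_opensearch_command(query, limit),
                         capture_output=True, text=True, check=True).stdout
    return json.loads(raw)[2]

print(shlex.join(build_opensearch_command("LangGraph")))
```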
Each component can be used on its own: drop a PlanningAgent into any LangChain chain, use SQLiteShortTermMemory for session persistence in a chatbot, or run the full LangGraphOrchestrator as a self-correcting pipeline.
Key architectural components
Planning agent
PlanningAgent breaks a high-level task into a structured JSON plan — a list of sequential steps passed to the executor.
Execution agent
ExecutionAgent carries out individual plan steps. It optionally wraps a LangGraph ReAct agent when tools such as CurlSearchTool are provided.
Monitoring agent
MonitoringAgent evaluates each step’s output against its original objective and returns structured JSON feedback with a success flag.
Local agent
LocalAgent is a lightweight summarization agent that aggressively condenses context to save tokens for downstream steps.
Sequential workflow
SequentialWorkflow runs planning → execution → monitoring in order, threading the output of each step as context into the next.
Parallel workflow
ParallelWorkflow executes independent tasks concurrently using a thread pool, returning all results when the batch completes.
LangGraph orchestrator
LangGraphOrchestrator builds a LangGraph StateGraph that coordinates planner, executor, and monitor nodes with configurable retry logic.
Short-term memory
SQLiteShortTermMemory persists session messages locally and provides exact-match cache lookups to skip redundant LLM calls.
CompressContextTool
Strips whitespace, removes common filler words, and truncates text locally — no LLM call required — before injecting context into prompts.
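The local compression described above can be approximated in a few lines of plain Python. This is a sketch of the technique, not the library's actual code: the filler-word list and default truncation limit below are assumptions.

```python
import re

# Assumed filler-word list and character limit; the library's real values
# may differ.
FILLER_WORDS = {"basically", "actually", "really", "very", "just", "quite"}

def compress_context(text: str, max_chars: int = 1000) -> str:
    # 1. Collapse runs of whitespace.
    text = re.sub(r"\s+", " ", text).strip()
    # 2. Drop common filler words (case-insensitive, punctuation-tolerant).
    words = [w for w in text.split(" ")
             if w.lower().strip(".,!?") not in FILLER_WORDS]
    text = " ".join(words)
    # 3. Truncate locally -- no LLM call involved.
    return text[:max_chars]

print(compress_context("This  is   basically just a   very long   note."))
```

Because everything runs locally, the tool can be applied before every prompt without adding latency or cost.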
CurlSearchTool
Queries the Wikipedia OpenSearch API via curl and returns the top three snippets, giving agents access to factual web content.
Local and cloud LLM support
Agentic Patterns works with any BaseChatModel from LangChain. Two configurations are used out of the box:
- Local (Ollama) — ChatOllama connects to a locally running Ollama server. Set LOCAL_MODEL and OLLAMA_HOST in your .env file.
- Cloud (OpenAI-compatible) — ChatOpenAI with a custom base_url connects to any OpenAI-compatible endpoint such as Routeway or LLM API. Set SUMMARY_HOST, SUMMARY_MODEL, and SUMMARY_AGENT_API_KEY.
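A minimal .env covering both configurations might look like the following; the variable names are the ones listed above, but every value is a placeholder to replace with your own.

```ini
# Local (Ollama)
LOCAL_MODEL=llama3.1
OLLAMA_HOST=http://localhost:11434

# Cloud (OpenAI-compatible endpoint)
SUMMARY_HOST=https://api.example.com/v1
SUMMARY_MODEL=gpt-4o-mini
SUMMARY_AGENT_API_KEY=sk-...
```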
Agents are LLM-agnostic. Any BaseChatModel instance — including Anthropic, Groq, or Mistral — can be passed to any agent constructor.
How it fits together
LangGraphOrchestrator encodes the plan → execute → monitor loop as a LangGraph state machine, retrying failed steps up to a configurable limit. SequentialWorkflow implements the same pattern in plain Python. Both expose a single .run(task=...) method and return structured result dictionaries.
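As a rough illustration of that loop, here is a plain-Python version with stand-in agents. The three callables and the shape of the result dictionary are assumptions made for the sketch, not the library's classes or return types.

```python
# Stand-in agents; in the library these roles are played by PlanningAgent,
# ExecutionAgent, and MonitoringAgent backed by an LLM.
def plan(task: str) -> list[str]:
    return [f"research: {task}", f"summarize: {task}"]

def execute(step: str) -> str:
    return f"result of ({step})"

def monitor(step: str, output: str) -> dict:
    return {"success": output.startswith("result of")}

def run(task: str, max_retries: int = 2) -> dict:
    results = []
    for step in plan(task):
        # Retry a step until the monitor reports success or retries run out.
        for attempt in range(max_retries + 1):
            output = execute(step)
            if monitor(step, output)["success"]:
                break
        results.append({"step": step, "output": output,
                        "attempts": attempt + 1})
    return {"task": task, "steps": results}

print(run(task="explain LangGraph"))
```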
Get started with the Quickstart.