## What It Does

The Add Ollama Tool skill:

- Adds Ollama MCP server to agent-runner
- Exposes tools to list and run local Ollama models
- Enables Claude to delegate tasks to local models
- Provides notification watcher for macOS
## Prerequisites

- NanoClaw base installation complete
- Ollama installed on host machine
- At least one Ollama model pulled (e.g., `gemma3:1b`, `llama3.2`)
## How to Apply

### Install Ollama

If not already installed:

- Download from https://ollama.com/download
- Install and start Ollama
- Pull a model:
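For example, pulling one of the models listed in the prerequisites:

```shell
ollama pull gemma3:1b
```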
### Apply code changes

The skill runs `npx tsx scripts/apply-skill.ts .claude/skills/add-ollama-tool`, which:

- Adds `container/agent-runner/src/ollama-mcp-stdio.ts`
- Adds `scripts/ollama-watch.sh` (notification watcher)
- Merges Ollama MCP config into agent-runner
- Merges log surfacing into container-runner
## What Changes

### Files Created

- `container/agent-runner/src/ollama-mcp-stdio.ts` - Ollama MCP server
- `scripts/ollama-watch.sh` - macOS notification watcher

### Files Modified

- `container/agent-runner/src/index.ts` - Adds Ollama MCP server to `allowedTools` and `mcpServers`
- `src/container-runner.ts` - Surfaces `[OLLAMA]` logs to host
- `.nanoclaw/state.yaml` - Records skill application
## Usage

### Tools Available

- `ollama_list_models` - Lists installed Ollama models
- `ollama_generate` - Sends a prompt to the specified model and returns the response
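Under the hood these tools map onto Ollama's documented REST API. The sketch below shows the shape of those calls; the helper names (`listModels`, `buildGeneratePayload`, `generate`) are illustrative assumptions, not the actual code in `ollama-mcp-stdio.ts`:

```typescript
// Sketch of the requests the MCP tools presumably make against Ollama's
// REST API. Helper names are hypothetical; endpoints are from Ollama's docs.
const OLLAMA_HOST =
  process.env.OLLAMA_HOST ?? "http://host.docker.internal:11434";

// ollama_list_models ~ GET /api/tags
async function listModels(): Promise<string[]> {
  const res = await fetch(`${OLLAMA_HOST}/api/tags`);
  const body = (await res.json()) as { models: { name: string }[] };
  return body.models.map((m) => m.name);
}

// ollama_generate ~ POST /api/generate, non-streaming so the MCP tool
// can return a single response string.
function buildGeneratePayload(model: string, prompt: string) {
  return { model, prompt, stream: false };
}

async function generate(model: string, prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_HOST}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildGeneratePayload(model, prompt)),
  });
  const body = (await res.json()) as { response: string };
  return body.response;
}
```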
### Example Requests
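Requests of the kind these tools serve might look like the following (the wording is illustrative, not taken from the skill):

```
What local Ollama models are available?
Use gemma3:1b to summarize this file in one paragraph.
```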
### When Claude Uses Ollama
Claude automatically delegates to Ollama for:

- Quick factual queries
- Summarization
- Translation
- Simple code tasks
- Repetitive operations

Claude handles directly:

- Tool use and orchestration
- Complex reasoning
- File operations
- Final response formatting
Ollama connects to `http://host.docker.internal:11434` by default (Docker Desktop). Set `OLLAMA_HOST` in `.env` for custom hosts.

## Optional: Activity Monitoring
Run the watcher script (`scripts/ollama-watch.sh`) for macOS notifications when Ollama is used.

## Troubleshooting
### Agent Says “Ollama is not installed”
The agent is trying to run the `ollama` CLI inside the container instead of using the MCP tools. This means:

- MCP server wasn’t registered - check that `container/agent-runner/src/index.ts` has an `ollama` entry
- Per-group source wasn’t updated - re-copy the files (see Step 4)
- Container wasn’t rebuilt - run `./container/build.sh`
### “Failed to connect to Ollama”
- Verify Ollama is running:
- Check Docker can reach host:
- If using custom host, check `OLLAMA_HOST` in `.env`
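The first two checks might look like this (the container-side `curl` assumes Docker Desktop’s `host.docker.internal` alias; the `curlimages/curl` image is just one convenient way to run `curl` in a container):

```shell
# On the host: is the Ollama daemon up, and which models does it serve?
curl -s http://localhost:11434/api/tags

# From a container: can Docker reach the host's Ollama port?
docker run --rm curlimages/curl -s http://host.docker.internal:11434/api/tags
```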