## Requirements

- **Node.js**: version 18 or higher
- **Docker**: Docker Engine 20.10+ and Docker Compose v2
- **Memory**: minimum 2 GB RAM (8 GB recommended for full stacks)
- **Disk space**: at least 10 GB free for Docker images and volumes
You don’t need to install better-openclaw globally. Use npx to run it directly with the latest version.

## Quick install

The recommended way to use better-openclaw is with npx:
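For example (run from any directory; npx downloads the CLI on demand):

```shell
# Always fetches the latest published version
npx better-openclaw@latest
```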
## Verify prerequisites

### Check Node.js version

Ensure you have Node.js 18 or higher. If you need to install or update Node.js, visit nodejs.org or use a version manager like nvm.
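A quick check, as a sketch: the small `node_major` helper extracts the major version, and the script falls back gracefully when `node` is missing.

```shell
# Extract the major version from a "vX.Y.Z" string
node_major() {
  echo "${1#v}" | cut -d. -f1
}

version="$(node --version 2>/dev/null || echo v0.0.0)"
if [ "$(node_major "$version")" -ge 18 ]; then
  echo "Node.js $version is new enough"
else
  # Install a supported release with nvm, e.g.: nvm install 18 && nvm use 18
  echo "Node.js 18+ required (found $version)"
fi
```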
### Check Docker installation

Verify Docker is installed and running. You should see Docker Engine 20.10+ and Docker Compose v2.
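A sketch of the check; the `parse_compose_major` helper extracts the major version number from the `docker compose version` output:

```shell
# Extract the first "vN" major version number from a version string
parse_compose_major() {
  echo "$1" | grep -o 'v[0-9][0-9]*' | head -n 1 | tr -d 'v'
}

docker --version 2>/dev/null || echo "Docker Engine is not installed"

compose_out="$(docker compose version 2>/dev/null || echo v0)"
major="$(parse_compose_major "$compose_out")"
if [ "${major:-0}" -ge 2 ]; then
  echo "Docker Compose v2 detected"
else
  echo "Docker Compose v2 is required (found: $compose_out)"
fi
```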
## Installation methods

### Using npx (recommended)

Run better-openclaw without installation. This method always uses the latest version and requires no maintenance or updates.
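For example:

```shell
npx better-openclaw@latest
```

Because npx resolves the package at run time, there is nothing to update locally.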
### Global installation

Install globally for repeated use:
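For example, with npm:

```shell
npm install -g better-openclaw
```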
### Using pnpm

If you prefer pnpm:
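For example:

```shell
pnpm add -g better-openclaw
```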
### Using yarn

With yarn:
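For example:

```shell
yarn global add better-openclaw
```

Note that `yarn global add` applies to Yarn 1 (Classic); Yarn 2+ removed global installs.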
## Shell completions

Enable tab completion for your shell:
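The exact command depends on the CLI. Many Node tools expose a `completion` subcommand; assuming better-openclaw follows that convention (this is an assumption, confirm with `better-openclaw --help`), enabling bash completion might look like:

```shell
# Hypothetical: assumes a `completion` subcommand exists
better-openclaw completion bash >> ~/.bashrc
```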
## Bare-metal deployment

Better OpenClaw supports hybrid native + Docker deployments where supported services run directly on the host.
### Generate bare-metal stack

Use `--deployment-type bare-metal` to create native install scripts alongside docker-compose.yml.
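Combined with the npx invocation used elsewhere in this guide:

```shell
npx better-openclaw@latest --deployment-type bare-metal
```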
### Run the installer

Execute the generated installation script. This installs native services (like Redis via apt/dnf) and starts the remaining services with Docker.
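Assuming the generated script is named `install.sh` (the actual file name may differ; check the generator's output):

```shell
# Review the script first: it installs system packages via apt/dnf
sudo ./install.sh
# Then start the remaining containerized services
docker compose up -d
```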
Currently only Redis supports native Linux installation. More services may be added in future releases.
## System resource guidelines

Plan your stack based on available resources:

| Stack size | RAM | Disk | Use case |
|---|---|---|---|
| Minimal (1-3 services) | 2 GB | 5 GB | Testing and development |
| Small (4-10 services) | 4 GB | 10 GB | Small teams, specific workflows |
| Medium (11-20 services) | 8 GB | 20 GB | Production workloads, multiple teams |
| Large (21-40 services) | 16 GB | 40 GB | Complete infrastructure, high availability |
| Full Stack (40+ services) | 32 GB | 80 GB | Enterprise deployments, all features |
## GPU support for AI services

Enable GPU passthrough for AI services that support it (Ollama, Stable Diffusion, ComfyUI). GPU support is added to docker-compose.yml automatically when using the
`--gpu` flag. Services will use `deploy.resources.reservations.devices` to request GPU access.
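For reference, a Compose GPU reservation looks like the following fragment (the `ollama` service name is illustrative; the generated file may differ):

```yaml
services:
  ollama:
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```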
## Troubleshooting

### Port conflicts

If you see port conflict errors:

1. Check what’s using the port.
2. Stop the conflicting service or regenerate with custom ports.
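The steps above can be run as follows (port 3000 is only an example; use the port from the error message):

```shell
# Port reported in the conflict error
port=3000

# 1. See what process is listening on the port
if command -v lsof >/dev/null 2>&1; then
  lsof -i ":$port" || echo "Nothing is listening on port $port"
else
  ss -tln | grep ":$port " || echo "Nothing is listening on port $port"
fi

# 2. Stop that service (e.g. systemctl stop <service>), or regenerate the
#    stack with different host ports in docker-compose.yml
```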
### Docker permission errors

If you see permission denied errors:
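On Linux, the usual fix is to add your user to the `docker` group:

```shell
sudo usermod -aG docker "$USER"
# Start a new login session (or use newgrp) so the group change takes effect
newgrp docker
```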
### Memory issues

If services are crashing or running slowly:

1. Check Docker memory limits.
2. Increase Docker Desktop memory allocation (Mac/Windows):
   - Open Docker Desktop preferences
   - Navigate to Resources > Advanced
   - Increase memory allocation
3. Use a smaller preset or fewer services.
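Step 1 can be done with `docker stats` (a one-shot snapshot) and `docker system info`:

```shell
# Per-container CPU and memory usage, printed once
docker stats --no-stream
# Total memory available to the Docker daemon
docker system info 2>/dev/null | grep -i "total memory"
```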
### Disk space issues

If you’re running low on disk space:
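Docker’s built-in cleanup commands reclaim most of the space:

```shell
# Show what is using disk space
docker system df
# Remove stopped containers, unused images, and build cache (asks to confirm)
docker system prune -a
# Also remove unused volumes; this deletes their data permanently
docker system prune -a --volumes
```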
## Updating better-openclaw

If you installed globally, update to the latest version. With npx, add `@latest` to always get the newest version:
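For example (npm shown; use the matching command for pnpm or yarn if you installed with those):

```shell
# Update a global npm installation
npm update -g better-openclaw

# Or skip local installs entirely and always run the latest version
npx better-openclaw@latest
```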
## Uninstalling

Remove the global installation:
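For an npm global install:

```shell
npm uninstall -g better-openclaw
```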
## Next steps

- **Quick start guide**: generate your first OpenClaw stack with presets and interactive mode
- **Service catalog**: explore all 94 available services and their capabilities