
Overview

This guide walks you through self-hosting the Hive agent framework on your own infrastructure. You’ll learn how to install dependencies, configure the environment, and run agents locally or on a server.

Prerequisites

  • Python 3.11+ - Required for the core framework
  • Node.js 20+ - Required for the web dashboard (optional)
  • uv - Fast Python package installer (installed automatically by quickstart)

Quick Start

The easiest way to get started is with the interactive quickstart script:
1. Clone the Repository

git clone https://github.com/yourusername/hive.git
cd hive
2. Run Quickstart

bash quickstart.sh
The script will:
  • Verify Python 3.11+ installation
  • Install uv package manager if missing
  • Install workspace packages (core + tools)
  • Install Playwright browser for web scraping
  • Configure your LLM provider
  • Initialize the credential store
  • Build the frontend dashboard
  • Install the hive CLI globally
3. Verify Installation

hive --version

Manual Installation

If you prefer manual setup, follow these steps:

1. Install Python Dependencies

# Install uv (if not already installed)
curl -LsSf https://astral.sh/uv/install.sh | sh
export PATH="$HOME/.local/bin:$PATH"

# Install workspace packages
cd /path/to/hive
uv sync

# Install Playwright browser
uv run python -m playwright install chromium

2. Configure LLM Provider

Set your API key as an environment variable:
# For Anthropic (Claude)
export ANTHROPIC_API_KEY="your-api-key-here"

# For OpenAI
export OPENAI_API_KEY="your-api-key-here"

# For Google Gemini
export GEMINI_API_KEY="your-api-key-here"
Create configuration file at ~/.hive/configuration.json:
{
  "llm": {
    "provider": "anthropic",
    "model": "claude-opus-4-6",
    "max_tokens": 32768,
    "api_key_env_var": "ANTHROPIC_API_KEY"
  },
  "gcu_enabled": true,
  "created_at": "2026-03-03T00:00:00+00:00"
}
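Before first run, it can help to confirm the file parses and contains the fields the framework reads. A minimal sketch, assuming the schema shown above (the field names are taken from the example, not from a published spec):

```python
import json
from pathlib import Path

CONFIG_PATH = Path.home() / ".hive" / "configuration.json"

def validate_config(path: Path = CONFIG_PATH) -> dict:
    """Load the configuration file and check the fields used in the example above."""
    cfg = json.loads(path.read_text())
    llm = cfg["llm"]
    # Fail early on a missing key rather than at the first agent run
    for key in ("provider", "model", "max_tokens", "api_key_env_var"):
        if key not in llm:
            raise KeyError(f"llm.{key} missing from {path}")
    return cfg
```

Running `validate_config()` after editing the file catches typos in key names before they surface as runtime errors.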

3. Initialize Credential Store

Generate an encryption key for the credential store:
# Generate encryption key
export HIVE_CREDENTIAL_KEY=$(uv run python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())")

# Save to your shell config
echo "export HIVE_CREDENTIAL_KEY='$HIVE_CREDENTIAL_KEY'" >> ~/.bashrc
The credential store will be initialized at ~/.hive/credentials/.
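A Fernet key is 32 random bytes in URL-safe base64. If you want to sanity-check the exported variable without extra packages, a stdlib-only sketch:

```python
import base64
import os

def is_valid_fernet_key(key: str) -> bool:
    """A Fernet key decodes to exactly 32 bytes of URL-safe base64."""
    try:
        return len(base64.urlsafe_b64decode(key.encode())) == 32
    except Exception:
        # Wrong padding, wrong alphabet, or empty string
        return False

if __name__ == "__main__":
    key = os.environ.get("HIVE_CREDENTIAL_KEY", "")
    print("valid" if is_valid_fernet_key(key) else "invalid or unset")
```

This only checks the key's shape; it cannot tell you whether it matches the key your credentials were encrypted with.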

4. Build Frontend (Optional)

If you want the web dashboard:
cd core/frontend
npm install
npm run build

5. Install CLI

Make the hive CLI globally accessible:
mkdir -p ~/.local/bin
ln -s "$(pwd)/hive" ~/.local/bin/hive

# Add to PATH if needed
export PATH="$HOME/.local/bin:$PATH"

Directory Structure

After installation, the ~/.hive directory will have this structure:
~/.hive/
├── configuration.json      # LLM and runtime config
├── credentials/            # Encrypted credential storage
│   ├── credentials/        # Encrypted credential files
│   └── metadata/           # Credential metadata index
├── secrets/
│   └── credential_key      # Encryption key (chmod 600)
└── agents/                 # Agent storage directories
    └── {agent-name}/
        ├── browser/        # Browser profiles
        └── storage/        # Agent-specific data
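If you are scripting a deployment, the layout can be scaffolded ahead of time. A sketch using only the directory names from the tree above (the root is parameterized so it is easy to test; the real framework may create these itself):

```python
from pathlib import Path

def scaffold_hive_dirs(root: Path) -> None:
    """Create the directory layout shown above under `root`."""
    for sub in (
        "credentials/credentials",  # encrypted credential files
        "credentials/metadata",     # credential metadata index
        "secrets",
        "agents",
    ):
        (root / sub).mkdir(parents=True, exist_ok=True)
    # The encryption key file should be readable only by the current user
    (root / "secrets" / "credential_key").touch(mode=0o600, exist_ok=True)
```

Pass `Path.home() / ".hive"` as `root` for a standard installation.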

Running Agents

CLI Interface

Run an agent from the command line:
# Run an exported agent
uv run python -m framework.runner.cli exports/my-agent \
  --input '{"query": "Research quantum computing"}'

TUI Dashboard

Launch the interactive terminal dashboard:
hive tui
The TUI allows you to:
  • Browse available agents
  • Start/stop agent execution
  • View real-time logs
  • Monitor token usage

Web Dashboard

Start the web server:
hive serve

# Or with custom port
hive serve --port 8080

# Auto-open browser
hive serve --open
Access the dashboard at http://localhost:8787.
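In deploy scripts it is useful to wait until the dashboard actually answers before proceeding. A stdlib-only polling helper (the URL matches the default above; it assumes the root URL responds once the server is up):

```python
import time
import urllib.error
import urllib.request

def wait_for_dashboard(url: str = "http://localhost:8787", timeout: float = 30.0) -> bool:
    """Poll `url` until it responds or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2):
                return True  # got any HTTP response: server is up
        except (urllib.error.URLError, OSError):
            time.sleep(0.5)  # not listening yet; retry
    return False
```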

Environment Variables

Required

  • HIVE_CREDENTIAL_KEY (string, required) - Fernet encryption key for credential storage. Generate with:

python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"

Optional

  • ANTHROPIC_API_KEY (string) - Anthropic API key for Claude models
  • OPENAI_API_KEY (string) - OpenAI API key for GPT models
  • GEMINI_API_KEY (string) - Google Gemini API key
  • BRAVE_SEARCH_API_KEY (string) - Brave Search API key for web search tools
  • HIVE_STORAGE_PATH (string) - Custom storage path for agent data. Defaults to ~/.hive/agents/{agent-name}
  • HIVE_AGENT_NAME (string) - Agent name for storage paths. Set automatically by AgentRunner.

Docker Deployment

Dockerfile

Create a Dockerfile for containerized deployment:
FROM python:3.12-slim

# Install system dependencies
RUN apt-get update && apt-get install -y \
    curl \
    git \
    && rm -rf /var/lib/apt/lists/*

# Install uv
RUN curl -LsSf https://astral.sh/uv/install.sh | sh
ENV PATH="/root/.local/bin:$PATH"

# Copy source
WORKDIR /app
COPY . .

# Install dependencies
RUN uv sync
RUN uv run python -m playwright install chromium --with-deps

# Expose ports
EXPOSE 8787

# Run dashboard
CMD ["./hive", "serve", "--host", "0.0.0.0"]

Docker Compose

version: '3.8'

services:
  hive:
    build: .
    ports:
      - "8787:8787"
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}
      - HIVE_CREDENTIAL_KEY=${HIVE_CREDENTIAL_KEY}
    volumes:
      - ./agents:/app/agents
      - hive-data:/root/.hive

volumes:
  hive-data:
Run with:
docker-compose up -d

Production Considerations

Security

  • Store HIVE_CREDENTIAL_KEY in a secrets manager (AWS Secrets Manager, HashiCorp Vault)
  • Never commit credential files to version control
  • Use encrypted volumes for ~/.hive/credentials/
  • Rotate encryption keys periodically
  • Use environment variables, never hardcode
  • Implement rate limiting for API calls
  • Monitor API usage and set alerts
  • Use separate keys for dev/staging/prod
  • Run dashboard behind a reverse proxy (nginx, Caddy)
  • Enable HTTPS with valid certificates
  • Implement authentication for web access
  • Use firewall rules to restrict access

Performance

  • LLM Caching: Enable prompt caching for providers that support it (Anthropic Claude)
  • Browser Sessions: Reuse browser contexts instead of launching new browsers
  • Concurrent Agents: Use async execution for multiple agents
  • Resource Limits: Set memory/CPU limits in Docker
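The concurrent-agents point can be sketched with plain asyncio. Here `run_agent` is a hypothetical stand-in for your actual runner call; the semaphore caps concurrency so you do not blow through provider rate limits:

```python
import asyncio

async def run_agent(name: str, input_data: dict) -> dict:
    """Hypothetical stand-in for a real AgentRunner call."""
    await asyncio.sleep(0.1)  # simulates LLM / tool latency
    return {"agent": name, "status": "ok"}

async def run_many(jobs: list[tuple[str, dict]], limit: int = 4) -> list[dict]:
    """Run agents concurrently, capped at `limit` in flight at once."""
    sem = asyncio.Semaphore(limit)

    async def bounded(name: str, data: dict) -> dict:
        async with sem:
            return await run_agent(name, data)

    # gather preserves input order in its results
    return await asyncio.gather(*(bounded(n, d) for n, d in jobs))
```

With `limit=4`, ten agents finish in roughly three batches of latency instead of ten sequential runs.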

Monitoring

# Enable debug logging
import logging
logging.basicConfig(level=logging.DEBUG)

# Monitor token usage (inside an async function)
result = await runner.run(input_data)
print(f"Input tokens: {result.metrics.input_tokens}")
print(f"Output tokens: {result.metrics.output_tokens}")

Troubleshooting

Common Issues

Error: Python 3.11+ is required
Solution: Install Python 3.11 or newer:
# macOS
brew install [email protected]

# Ubuntu/Debian
sudo apt install python3.12
Error: playwright install chromium failed
Solution: Install system dependencies:
# Ubuntu/Debian
sudo apt-get install -y libnss3 libatk1.0-0 libatk-bridge2.0-0

# Then retry
uv run python -m playwright install chromium --with-deps
Error: Failed to decrypt credential
Solution: Verify HIVE_CREDENTIAL_KEY is set correctly:
echo $HIVE_CREDENTIAL_KEY
# Should print a base64-encoded key

# If empty, regenerate and reconfigure credentials
Error: Address already in use: 8787
Solution: Use a different port:
hive serve --port 8080

Next Steps

Credential Management

Learn how to securely store and manage API credentials

LLM Providers

Configure different LLM providers and models

MCP Integration

Connect external MCP servers for tool access

Browser Automation

Enable browser control for web automation tasks
