OpenFang supports multiple installation methods across all major platforms. Choose the method that works best for your environment.

Installation Methods

- Shell Installer: one-line install for macOS/Linux
- Windows PowerShell: PowerShell installer for Windows
- Docker: containerized deployment
- Cargo: build from source with Rust

Shell Installer (macOS/Linux)

The fastest way to get started on Unix-like systems:
curl -fsSL https://openfang.sh/install | sh
This script:
  1. Downloads the latest CLI binary for your platform
  2. Installs it to ~/.openfang/bin/
  3. Adds the binary to your PATH
  4. Verifies the installation
The installer is non-invasive and only modifies files in ~/.openfang/.

Manual Download

If you prefer to download manually, get the latest release from GitHub Releases:
# Download for your platform
wget https://github.com/RightNow-AI/openfang/releases/latest/download/openfang-linux-x86_64

# Make executable
chmod +x openfang-linux-x86_64

# Move to PATH
sudo mv openfang-linux-x86_64 /usr/local/bin/openfang

Windows PowerShell Installer

For Windows users, run this command in PowerShell:
irm https://openfang.sh/install.ps1 | iex
This script:
  1. Downloads the latest Windows binary
  2. Verifies the SHA256 checksum
  3. Installs to %USERPROFILE%\.openfang\bin\
  4. Adds the directory to your user PATH
You may need to restart your terminal (or, if you use Chocolatey, run refreshenv) for PATH changes to take effect.

Manual Installation on Windows

  1. Download openfang-windows-x86_64.exe from GitHub Releases
  2. Rename to openfang.exe
  3. Move to a directory in your PATH (e.g., C:\Program Files\OpenFang\)
  4. Add that directory to your system PATH if needed

Desktop App (Windows/macOS/Linux)

OpenFang also provides native desktop apps with system tray integration, auto-updates, and OS notifications:
Download the installer for your platform from GitHub Releases and run it (on Windows, the .msi installer).
The desktop app includes the full CLI as well. You can use both the GUI and terminal commands.

Docker

Run OpenFang in a container for isolated deployment.

Using Docker Run

docker pull ghcr.io/rightnow-ai/openfang:latest

docker run -d \
  --name openfang \
  -p 4200:4200 \
  -e ANTHROPIC_API_KEY=$ANTHROPIC_API_KEY \
  -e OPENAI_API_KEY=$OPENAI_API_KEY \
  -e GROQ_API_KEY=$GROQ_API_KEY \
  -v openfang-data:/data \
  ghcr.io/rightnow-ai/openfang:latest
Note: The GHCR image is not yet public as of v0.1.0. For now, build from source using Docker Compose (see below).

Using Docker Compose

Clone the repository and use the included docker-compose.yml:
git clone https://github.com/RightNow-AI/openfang.git
cd openfang

# Set your API keys in environment or .env file
echo "ANTHROPIC_API_KEY=sk-ant-..." > .env
echo "OPENAI_API_KEY=sk-..." >> .env
echo "GROQ_API_KEY=gsk_..." >> .env

# Build and start
docker compose up -d
The docker-compose.yml includes:
docker-compose.yml
version: "3.8"
services:
  openfang:
    build: .
    ports:
      - "4200:4200"
    volumes:
      - openfang-data:/data
    environment:
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:-}
      - OPENAI_API_KEY=${OPENAI_API_KEY:-}
      - GROQ_API_KEY=${GROQ_API_KEY:-}
      - TELEGRAM_BOT_TOKEN=${TELEGRAM_BOT_TOKEN:-}
      - DISCORD_BOT_TOKEN=${DISCORD_BOT_TOKEN:-}
      - SLACK_BOT_TOKEN=${SLACK_BOT_TOKEN:-}
      - SLACK_APP_TOKEN=${SLACK_APP_TOKEN:-}
    restart: unless-stopped

volumes:
  openfang-data:

Docker Environment Variables

The Docker container supports these environment variables:
| Variable | Description |
| --- | --- |
| ANTHROPIC_API_KEY | Anthropic (Claude) API key |
| OPENAI_API_KEY | OpenAI API key |
| GROQ_API_KEY | Groq API key |
| TELEGRAM_BOT_TOKEN | Telegram bot token for channel adapter |
| DISCORD_BOT_TOKEN | Discord bot token for channel adapter |
| SLACK_BOT_TOKEN | Slack bot token for channel adapter |
| SLACK_APP_TOKEN | Slack app token for channel adapter |
| OPENFANG_HOME | Data directory (defaults to /data) |
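Rather than repeating `-e` flags, Docker's standard `--env-file` flag can load every key from a `KEY=value` file. A sketch, assuming a `.env` file like the one created in the Docker Compose section:

```shell
# Load all API keys from a .env file (one KEY=value per line)
# instead of passing each one with -e:
docker run -d \
  --name openfang \
  -p 4200:4200 \
  --env-file .env \
  -v openfang-data:/data \
  ghcr.io/rightnow-ai/openfang:latest
```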

Access the Dashboard

With the container running, open your browser to:
http://localhost:4200/
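You can also confirm from a terminal that the server is answering. A simple reachability check, assuming the default port 4200:

```shell
# Prints "dashboard is up" only if the server answers on port 4200:
curl -fsS http://localhost:4200/ > /dev/null && echo "dashboard is up"
```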

Cargo Install (Build from Source)

Requires Rust 1.75+.

Install Rust

If you don’t have Rust installed:
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

Install from GitHub

cargo install --git https://github.com/RightNow-AI/openfang openfang-cli
This compiles the openfang binary and installs it to ~/.cargo/bin/ (which should be in your PATH).
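If the freshly installed binary is not found, check that ~/.cargo/bin is actually on your PATH. A quick check:

```shell
# Print a confirmation if ~/.cargo/bin is on PATH, otherwise warn:
echo "$PATH" | tr ':' '\n' | grep -Fqx "$HOME/.cargo/bin" \
  && echo "~/.cargo/bin is on PATH" \
  || echo "~/.cargo/bin is NOT on PATH; add it in your shell profile"
```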

Build from Source (Development)

For development or to build a specific commit:
git clone https://github.com/RightNow-AI/openfang.git
cd openfang

# Build the CLI binary
cargo build --release -p openfang-cli

# Binary is at: target/release/openfang

# Optionally install to ~/.cargo/bin/
cargo install --path crates/openfang-cli
Building from source requires several system dependencies including pkg-config, libssl-dev, and a C compiler. See the project README for full build requirements.

Run Tests

OpenFang includes 1,767+ tests with zero clippy warnings:
# Build workspace
cargo build --workspace --lib

# Run all tests
cargo test --workspace

# Lint (must be 0 warnings)
cargo clippy --workspace --all-targets -- -D warnings

# Format check
cargo fmt --all -- --check

Verify Installation

Regardless of installation method, verify that OpenFang is installed correctly:
openfang --version
Expected output (the version number reflects your installed release):
openfang 0.3.25
Run the diagnostic tool:
openfang doctor
Expected output:
✓ Config file exists at ~/.openfang/config.toml
✓ API keys configured
✓ Toolchain available
All checks passed!

Environment Setup

Initialize Configuration

Create the default configuration:
openfang init
This creates:
~/.openfang/
  config.toml    # Main configuration
  data/          # Database and runtime data
  agents/        # Agent manifests (optional)

Set API Keys

OpenFang needs at least one LLM provider API key. Set it as an environment variable:
export ANTHROPIC_API_KEY=sk-ant-...
Add the export to your shell profile to persist it:
echo 'export ANTHROPIC_API_KEY=sk-ant-...' >> ~/.bashrc
source ~/.bashrc
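To confirm the key is visible to your shell without echoing the secret itself, a quick check:

```shell
# Report whether the key is set without printing its value:
if [ -n "${ANTHROPIC_API_KEY:-}" ]; then
  echo "ANTHROPIC_API_KEY is set"
else
  echo "ANTHROPIC_API_KEY is NOT set" >&2
fi
```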

Edit Configuration

The default config uses Anthropic. To change the provider, edit ~/.openfang/config.toml:
config.toml
[default_model]
provider = "groq"                      # anthropic, openai, groq, ollama, etc.
model = "llama-3.3-70b-versatile"      # Model identifier for the provider
api_key_env = "GROQ_API_KEY"           # Env var holding the API key

[memory]
decay_rate = 0.05                      # Memory confidence decay rate

[network]
listen_addr = "127.0.0.1:4200"        # OFP listen address

Platform-Specific Notes

macOS

On macOS, the first time you run openfang, you may see a security warning. Click “Open” in System Settings → Privacy & Security (System Preferences → Security & Privacy on older macOS versions).
If you installed via Homebrew (community tap), update with:
brew upgrade openfang

Linux

Ensure libssl is installed:
sudo apt-get update
sudo apt-get install libssl-dev ca-certificates

Windows

Windows Defender may scan the binary on first run, causing a slight delay. This is normal.
If you see “vcruntime140.dll missing”, install the Visual C++ Redistributable.

Next Steps

Quick Start Guide

Follow the quick start guide to spawn your first agent in under 5 minutes.

Configuration

Learn about config.toml options, model routing, and provider setup.

Agent Templates

Explore 30 pre-built agent templates for common use cases.

Channel Adapters

Connect agents to 40 messaging platforms.

API Reference

Explore 140+ REST/WS/SSE endpoints.

Troubleshooting

Command not found

If openfang is not found after installation:
Check if ~/.openfang/bin is in your PATH:
echo $PATH | grep openfang
If not, add it to your shell profile:
echo 'export PATH="$HOME/.openfang/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc

Permission denied

On Linux/macOS, ensure the binary is executable:
chmod +x ~/.openfang/bin/openfang

Port already in use

If port 4200 is already in use, edit ~/.openfang/config.toml to change the listen address:
[network]
listen_addr = "127.0.0.1:4201"  # Change to any available port
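To see what is holding the port before (or after) changing it, one option on Linux, assuming `ss` from iproute2 is available (on macOS, `lsof -i :4200` is the equivalent):

```shell
# List listeners on port 4200, or report that it is free:
ss -ltn | grep ':4200' || echo "port 4200 is free"
```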
