
System requirements

Before installing Strix, ensure your system meets these requirements:
  • Python 3.12 or higher (3.13 and 3.14 supported)
  • Docker installed and running
  • Operating system: Linux, macOS, or Windows (with WSL2)
  • Memory: 4GB minimum (8GB recommended)
  • Disk space: 5GB for Docker images and scan results
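The requirements above can be checked with a short shell sketch before installing. This is an illustrative script, not part of Strix itself; the Docker check assumes the `docker` CLI is on your PATH:

```shell
#!/bin/sh
# Sketch: check the Python and Docker prerequisites listed above.
ver=$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])')
major=${ver%%.*}
minor=${ver#*.}
if [ "$major" -gt 3 ] || { [ "$major" -eq 3 ] && [ "$minor" -ge 12 ]; }; then
  echo "Python $ver: OK"
else
  echo "Python $ver: 3.12+ required" >&2
fi

# Docker must be installed *and* the daemon running; `docker info`
# talks to the daemon, so it catches both problems.
if docker info >/dev/null 2>&1; then
  echo "Docker: OK"
else
  echo "Docker: not installed or daemon not running" >&2
fi
```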

Installation methods

Docker setup

Strix requires Docker to create isolated security testing environments.

Install Docker

1. Download Docker Desktop

Download and install Docker Desktop for your platform from the Docker website.
2. Start Docker

Launch Docker Desktop and ensure it’s running:
docker ps
You should see an empty list of containers (or your existing containers).
3. Pull the sandbox image (optional)

The Docker image downloads automatically on first run, but you can pull it manually:
docker pull ghcr.io/usestrix/sandbox:latest
The image is approximately 2GB.

Docker configuration

Strix uses these Docker settings from your environment:
# Optional: customize Docker socket path (default: /var/run/docker.sock)
export DOCKER_HOST="unix:///var/run/docker.sock"

# Optional: customize image (default: ghcr.io/usestrix/sandbox:latest)
export STRIX_IMAGE="ghcr.io/usestrix/sandbox:latest"

LLM provider configuration

Strix requires an LLM provider for the AI agents. Configure your provider with environment variables:

Required configuration

# The LLM model to use (required)
export STRIX_LLM="openai/gpt-5"

# API key for most providers (not needed for Vertex AI, AWS Bedrock)
export LLM_API_KEY="your-api-key-here"

Provider examples

export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."
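The same two variables cover other providers; for example, Anthropic (model identifier taken from the list below; the key value is a placeholder):

```shell
# Anthropic Claude Sonnet 4.6 (key shown is a placeholder, not a real format guarantee)
export STRIX_LLM="anthropic/claude-sonnet-4-6"
export LLM_API_KEY="sk-ant-..."
```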
For best results, use these models:
  • OpenAI GPT-5: openai/gpt-5
  • Anthropic Claude Sonnet 4.6: anthropic/claude-sonnet-4-6
  • Google Gemini 3 Pro Preview: vertex_ai/gemini-3-pro-preview
See LLM Providers for the complete list of supported models and configuration options.

Optional configuration

# Custom API base URL (for local models, proxies, etc.)
export LLM_API_BASE="http://localhost:11434"

# Perplexity API key for web search capabilities
export PERPLEXITY_API_KEY="pplx-..."

# Reasoning effort: none, minimal, low, medium, high, xhigh (default: high)
export STRIX_REASONING_EFFORT="high"

# LLM timeout in seconds (default: 300)
export LLM_TIMEOUT="300"
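Combining LLM_API_BASE with a local server might look like the following sketch. Port 11434 is the common Ollama default from the example above, but the model identifier here is an assumption, not a documented value; check LLM Providers for the exact naming scheme:

```shell
# Illustrative local-model setup; adjust STRIX_LLM to a model your
# server actually serves.
export STRIX_LLM="openai/llama3"            # hypothetical identifier
export LLM_API_BASE="http://localhost:11434"
export LLM_API_KEY="unused"                 # some local servers ignore the key
```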

Configuration persistence

Strix automatically saves your configuration to ~/.strix/cli-config.json after the first successful run. This means you only need to set environment variables once.

View saved configuration

cat ~/.strix/cli-config.json

Override saved configuration

You can override the saved config with:
  1. Environment variables - Set before running Strix
  2. Custom config file - Use --config flag:
strix --target ./app --config /path/to/custom-config.json

Configuration priority

Strix uses this priority order (highest to lowest):
  1. Command-line --config file
  2. Environment variables
  3. Saved config at ~/.strix/cli-config.json
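For example, to override the saved model for a single run without editing any files, use a per-command environment assignment (model name taken from the provider examples above):

```shell
# Environment variable (priority 2) beats the saved config (priority 3)
# for just this invocation:
STRIX_LLM="anthropic/claude-sonnet-4-6" strix --target ./app

# An explicit --config file (priority 1) beats both:
strix --target ./app --config /path/to/custom-config.json
```

The per-command form sets the variable only for that one process, so your saved config is untouched for later runs.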

Verify installation

Confirm everything is set up correctly:
1. Check Strix version

strix --version
Should output: strix 0.8.2 (or your installed version)
2. Verify Docker

docker ps
Should show your running containers (or empty list if none)
3. Test LLM connection

Strix tests your LLM connection on startup. Run a quick scan to verify:
# Create a test directory
mkdir test-app && cd test-app
echo "print('hello')" > app.py

# Run Strix (will validate LLM connection)
strix --target .
If the LLM connection fails, you’ll see a clear error message.

Troubleshooting

Command not found

If the strix command is not found after installation:

Using pip:

# Ensure pip's bin directory is in PATH
python -m pip show strix-agent

# Or run directly with Python
python -m strix.interface.main --version

Using the install script:

# Re-run the install script
curl -sSL https://strix.ai/install | bash
source ~/.bashrc  # or ~/.zshrc
Wrong Python version

Strix requires Python 3.12+. Check your version:

python --version

If multiple Python versions are installed, invoke 3.12 explicitly:

python3.12 -m pip install strix-agent

Or use pyenv to manage Python versions.
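If you go the pyenv route, a minimal sequence looks like this (assumes pyenv is already installed and can resolve a 3.12.x release):

```shell
# Install a 3.12 interpreter and pin it for the current project
pyenv install 3.12
pyenv local 3.12

# The `python` shim now resolves to 3.12; install Strix with it
python --version
python -m pip install strix-agent
```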
Docker permission errors

If you see "permission denied" errors with Docker:

Linux:

# Add your user to the docker group
sudo usermod -aG docker $USER
newgrp docker

macOS/Windows: ensure Docker Desktop is running with proper permissions.
Missing configuration

If Strix complains about missing configuration:
# Set required variables
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="your-key"

# Make permanent (add to ~/.bashrc or ~/.zshrc)
echo 'export STRIX_LLM="openai/gpt-5"' >> ~/.bashrc
echo 'export LLM_API_KEY="your-key"' >> ~/.bashrc

Next steps

Quickstart

Run your first security scan in minutes

Basic usage

Learn the core commands and workflows

LLM providers

Configure different LLM providers and models

Environment variables

Complete reference for all configuration options
Note: Make sure Docker is running before executing Strix commands. The CLI will fail with a clear error if Docker is not available.
