~/.strix/cli-config.json, so you don’t need to set these variables on every run.
Required Variables
Model name to use with LiteLLM (e.g., openai/gpt-5, anthropic/claude-sonnet-4-6). For Strix Router, use the strix/ prefix (e.g., strix/gpt-5).

LLM Configuration
API key for your LLM provider. Not required for local models (Ollama, LMStudio), cloud providers with built-in authentication (Vertex AI, AWS Bedrock), or when using Strix Router.
Custom API base URL for your LLM provider. Use this when running local models (Ollama, LMStudio) or custom endpoints. For strix/ models, this is automatically set to Strix Router.

OpenAI-specific API base URL. This is an alternative to LLM_API_BASE for OpenAI-compatible endpoints.

LiteLLM-specific base URL. This is an alternative to LLM_API_BASE when using LiteLLM proxy.

Ollama-specific API base URL. This is an alternative to LLM_API_BASE when using Ollama.

Controls the reasoning effort level for LLM responses. Valid values: none, minimal, low, medium, high, xhigh. Higher values result in more thorough analysis but slower responses. For quick scans, use medium.

Maximum number of retries for LLM API calls when encountering transient errors.
Timeout in seconds for LLM API requests.
Timeout in seconds for the memory compression operation.
Tool & Feature Configuration
API key for Perplexity AI web search. Enables real-time research and OSINT capabilities during penetration testing. Get your API key at perplexity.ai.
Disables the browser automation tool. Set to true to prevent Strix from using browser automation (Playwright). This disables testing for XSS, CSRF, and other client-side vulnerabilities.

Runtime Configuration
Docker image to use for the Strix sandbox environment. You can override this to use a custom sandbox image or a specific version.
Runtime backend for Strix sandboxes. Currently, only docker is supported.

Timeout in seconds for sandbox tool execution. This controls how long Strix will wait for individual tool executions (terminal commands, Python scripts, etc.) to complete.
Timeout in seconds for connecting to the sandbox tool server.
Docker daemon connection URL. By default, Strix uses the local Docker daemon. Set this to connect to a remote Docker host.
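As an illustrative sketch, this points the standard Docker client variable at a remote daemon; whether Strix reads DOCKER_HOST specifically is an assumption, and the hostname is a placeholder:

```shell
# Connect to a remote Docker daemon instead of the local one
export DOCKER_HOST="tcp://build-host.internal:2375"  # hostname/port are illustrative
```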
Telemetry
Enable or disable anonymous telemetry. Set to 0 to opt out of telemetry collection. Telemetry helps improve Strix by collecting anonymous usage statistics.

Example Configurations
OpenAI GPT-5
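A minimal sketch of this setup, with a placeholder key value:

```shell
# Use GPT-5 via the OpenAI provider prefix
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."  # replace with your OpenAI API key
```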
Anthropic Claude
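A minimal sketch using the Claude model named earlier in this page, with a placeholder key value:

```shell
# Use Claude via the Anthropic provider prefix
export STRIX_LLM="anthropic/claude-sonnet-4-6"
export LLM_API_KEY="sk-ant-..."  # replace with your Anthropic API key
```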
Strix Router (Multiple Providers)
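A minimal sketch: with the strix/ prefix, no API key or base URL is needed, since LLM_API_BASE is set to Strix Router automatically:

```shell
# Route through Strix Router; no LLM_API_KEY or LLM_API_BASE required
export STRIX_LLM="strix/gpt-5"
```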
Ollama (Local)
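A minimal sketch for a local Ollama model; the model tag is an assumed example, and 11434 is Ollama's default local port:

```shell
export STRIX_LLM="ollama/llama3"              # any locally pulled model tag (assumed example)
export LLM_API_BASE="http://localhost:11434"  # Ollama's default local endpoint
# No LLM_API_KEY is required for local models.
```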
With Web Search
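A minimal sketch combining an LLM provider with Perplexity web search; the PERPLEXITY_API_KEY variable name is an assumption, since this page does not name it explicitly:

```shell
export STRIX_LLM="openai/gpt-5"
export LLM_API_KEY="sk-..."           # replace with your OpenAI API key
export PERPLEXITY_API_KEY="pplx-..."  # variable name assumed; enables web search/OSINT
```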
Configuration Persistence
Strix automatically saves your environment variables to ~/.strix/cli-config.json. This means you only need to set these variables once, and Strix will remember them for future runs.
To override saved configuration temporarily, you can:
- Set environment variables in your current shell session
- Use a custom config file with the --config flag
- Clear a variable by setting it to an empty string: export STRIX_LLM=""
LLM configuration variables (STRIX_LLM, LLM_API_KEY, etc.) are re-validated on each run. If you change these in your environment, Strix will use the new values and update the saved configuration.