Shannon supports multiple AI providers beyond direct Anthropic API access. Route requests through AWS Bedrock, Google Vertex AI, or experimental multi-model providers.

Provider Options

  • Anthropic API: Direct API access (default, recommended)
  • AWS Bedrock: Route requests through Amazon Bedrock
  • Google Vertex AI: Route requests through Google Cloud Vertex AI
  • Router Mode: Experimental multi-model support

Anthropic API (Default)

Direct API access is the simplest and recommended approach.

Setup

1. Get API Key

Sign up at Anthropic Console and create an API key.

2. Configure .env

.env
ANTHROPIC_API_KEY=your-api-key-here
CLAUDE_CODE_MAX_OUTPUT_TOKENS=64000

3. Run Shannon

./shannon start URL=https://example.com REPO=repo-name

OAuth Token Alternative

Use a Claude Code OAuth token instead of an API key:
.env
CLAUDE_CODE_OAUTH_TOKEN=your-oauth-token-here
CLAUDE_CODE_MAX_OUTPUT_TOKENS=64000
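Before starting a run, it can help to confirm that one of the two supported credential variables is actually set. The helper below is an illustrative sketch, not part of Shannon:

```shell
# Illustrative pre-flight check: confirm one of the two supported
# credential variables is non-empty before starting Shannon.
check_credentials() {
  if [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${CLAUDE_CODE_OAUTH_TOKEN:-}" ]; then
    echo "credentials found"
  else
    echo "missing: set ANTHROPIC_API_KEY or CLAUDE_CODE_OAUTH_TOKEN in .env"
  fi
}

# Load .env into the environment (if present) and run the check
if [ -f .env ]; then set -a; . ./.env; set +a; fi
check_credentials
```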

AWS Bedrock

Route requests through Amazon Bedrock instead of direct Anthropic API.

Prerequisites

  • AWS account with Bedrock access
  • Claude models enabled in your AWS region
  • Bedrock API keys (Bearer token)

Setup

1. Generate Bedrock API Key

Follow AWS Bedrock API Keys guide to generate a bearer token.

2. Configure .env

.env
CLAUDE_CODE_USE_BEDROCK=1
AWS_REGION=us-east-1
AWS_BEARER_TOKEN_BEDROCK=your-bearer-token

# Set models with Bedrock-specific IDs for your region
ANTHROPIC_SMALL_MODEL=us.anthropic.claude-haiku-4-5-20251001-v1:0
ANTHROPIC_MEDIUM_MODEL=us.anthropic.claude-sonnet-4-6
ANTHROPIC_LARGE_MODEL=us.anthropic.claude-opus-4-6

3. Run Shannon

./shannon start URL=https://example.com REPO=repo-name
Shannon automatically uses Bedrock for all requests.

Model Tiers

Shannon uses three model tiers, each configured with a region-specific Bedrock model ID. The small tier (ANTHROPIC_SMALL_MODEL) is used for summarization and light analysis:
ANTHROPIC_SMALL_MODEL=us.anthropic.claude-haiku-4-5-20251001-v1:0
Default: claude-haiku-4-5-20251001

Regional Model IDs

Bedrock model IDs vary by AWS region. Common formats:
ANTHROPIC_SMALL_MODEL=us.anthropic.claude-haiku-4-5-20251001-v1:0
ANTHROPIC_MEDIUM_MODEL=us.anthropic.claude-sonnet-4-6
ANTHROPIC_LARGE_MODEL=us.anthropic.claude-opus-4-6
Regional Availability: Not all Claude models are available in every AWS region. Check the AWS Bedrock Model Catalog for your region’s supported models.
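Regional Bedrock model IDs like the ones above carry a region-group prefix (us., eu., or apac.). A small illustrative helper (not part of Shannon) that catches an accidentally pasted plain Anthropic model name before startup:

```shell
# Illustrative check: regional Bedrock model IDs start with a region-group
# prefix (us. / eu. / apac.); warn when a value does not look like one.
check_bedrock_id() {
  case "$1" in
    us.anthropic.*|eu.anthropic.*|apac.anthropic.*) echo "ok: $1" ;;
    *) echo "warning: '$1' does not look like a regional Bedrock model ID" ;;
  esac
}

check_bedrock_id "us.anthropic.claude-haiku-4-5-20251001-v1:0"
```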

Configuration Reference

Variable                   Type    Required  Description
CLAUDE_CODE_USE_BEDROCK    string            Enable Bedrock mode (set to 1)
AWS_REGION                 string  required  AWS region for Bedrock (e.g., us-east-1, us-west-2)
AWS_BEARER_TOKEN_BEDROCK   string  required  Bedrock API bearer token
ANTHROPIC_SMALL_MODEL      string  required  Bedrock model ID for small tier
ANTHROPIC_MEDIUM_MODEL     string  required  Bedrock model ID for medium tier
ANTHROPIC_LARGE_MODEL      string  required  Bedrock model ID for large tier

Google Vertex AI

Route requests through Google Cloud Vertex AI.

Prerequisites

  • Google Cloud Platform project
  • Vertex AI API enabled
  • Service account with roles/aiplatform.user role

Setup

1. Create Service Account

In the GCP Console:
  1. Create a new service account
  2. Grant roles/aiplatform.user role
  3. Create and download a JSON key file

2. Place Key File

mkdir -p ./credentials
cp /path/to/your-sa-key.json ./credentials/gcp-sa-key.json
The ./credentials/ directory is mounted into the Docker container.
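As an optional sanity check, you can verify the key file is in place and contains the fields service-account auth relies on. This helper is an illustrative sketch, not part of Shannon:

```shell
# Illustrative check: confirm a service-account key file exists and carries
# the standard fields (type, project_id, client_email, private_key).
check_sa_key() {
  key="$1"
  if [ ! -f "$key" ]; then
    echo "key file not found: $key"
    return 1
  fi
  for field in '"type"' '"project_id"' '"client_email"' '"private_key"'; do
    if ! grep -q "$field" "$key"; then
      echo "warning: $field not found in $key"
      return 1
    fi
  done
  echo "key file looks valid"
}

# Usage (from the Shannon repo root):
#   check_sa_key ./credentials/gcp-sa-key.json
```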

3. Configure .env

.env
CLAUDE_CODE_USE_VERTEX=1
CLOUD_ML_REGION=us-east5
ANTHROPIC_VERTEX_PROJECT_ID=your-gcp-project-id
GOOGLE_APPLICATION_CREDENTIALS=./credentials/gcp-sa-key.json

# Set models with Vertex AI model IDs
ANTHROPIC_SMALL_MODEL=claude-haiku-4-5@20251001
ANTHROPIC_MEDIUM_MODEL=claude-sonnet-4-6
ANTHROPIC_LARGE_MODEL=claude-opus-4-6

4. Run Shannon

./shannon start URL=https://example.com REPO=repo-name

Regional Endpoints

Vertex AI supports global and regional endpoints:
CLOUD_ML_REGION=global      # global endpoint
CLOUD_ML_REGION=us-east5    # regional endpoint
Some models may not be available on the global endpoint; use a regional endpoint for full model availability.
Check the Vertex AI Model Garden for region availability.

Model IDs

Vertex AI uses different model ID formats than Anthropic or Bedrock:
ANTHROPIC_SMALL_MODEL=claude-haiku-4-5@20251001
ANTHROPIC_MEDIUM_MODEL=claude-sonnet-4-6
ANTHROPIC_LARGE_MODEL=claude-opus-4-6

Configuration Reference

Variable                        Type    Required  Description
CLAUDE_CODE_USE_VERTEX          string            Enable Vertex AI mode (set to 1)
CLOUD_ML_REGION                 string  required  GCP region, or global for the global endpoint
ANTHROPIC_VERTEX_PROJECT_ID     string  required  Your GCP project ID
GOOGLE_APPLICATION_CREDENTIALS  string  required  Path to service account JSON key file (must be in ./credentials/)
ANTHROPIC_SMALL_MODEL           string  required  Vertex AI model ID for small tier
ANTHROPIC_MEDIUM_MODEL          string  required  Vertex AI model ID for medium tier
ANTHROPIC_LARGE_MODEL           string  required  Vertex AI model ID for large tier

Router Mode (Experimental)

Experimental Feature: Router mode is experimental and unsupported. It’s intended for model experimentation only. Output quality varies significantly by model, and runs may fail in early phases. Shannon is optimized for Anthropic Claude models; alternative providers may produce inconsistent results.

Overview

Router mode uses claude-code-router to route requests through alternative AI providers:
  • OpenAI: GPT-5.2, GPT-5-mini
  • OpenRouter: Google Gemini 3 Flash Preview

Setup

1. Configure Provider

Add your provider API key to .env:
.env
OPENAI_API_KEY=sk-...
ROUTER_DEFAULT=openai,gpt-5.2

2. Run with ROUTER=true

./shannon start URL=https://example.com REPO=repo-name ROUTER=true
The ROUTER=true flag starts the router container and routes all requests through it.

Available Models

OpenAI:
ROUTER_DEFAULT=openai,gpt-5.2
Models: gpt-5.2, gpt-5-mini

OpenRouter:
ROUTER_DEFAULT=openrouter,google/gemini-3-flash-preview
Models: google/gemini-3-flash-preview

Router Configuration

Router settings are defined in configs/router-config.json:
configs/router-config.json
{
  "HOST": "0.0.0.0",
  "APIKEY": "shannon-router-key",
  "Providers": [
    {
      "name": "openai",
      "api_base_url": "https://api.openai.com/v1/chat/completions",
      "api_key": "$OPENAI_API_KEY",
      "models": ["gpt-5.2", "gpt-5-mini"]
    },
    {
      "name": "openrouter",
      "api_base_url": "https://openrouter.ai/api/v1/chat/completions",
      "api_key": "$OPENROUTER_API_KEY",
      "models": ["google/gemini-3-flash-preview"]
    }
  ],
  "Router": {
    "default": "$ROUTER_DEFAULT"
  }
}
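ROUTER_DEFAULT takes the form provider,model, and the matching provider API key must be set in the environment so the $OPENAI_API_KEY / $OPENROUTER_API_KEY placeholders in router-config.json resolve to real values. An illustrative pre-flight check (not part of Shannon):

```shell
# Illustrative check: parse the provider out of a "provider,model" value and
# warn when the matching API key environment variable is unset.
router_provider_key_set() {
  provider="${1%%,*}"
  case "$provider" in
    openai)
      if [ -n "${OPENAI_API_KEY:-}" ]; then echo "ok: openai"; else echo "warning: OPENAI_API_KEY not set"; fi ;;
    openrouter)
      if [ -n "${OPENROUTER_API_KEY:-}" ]; then echo "ok: openrouter"; else echo "warning: OPENROUTER_API_KEY not set"; fi ;;
    *)
      echo "unknown provider: $provider" ;;
  esac
}

router_provider_key_set "openai,gpt-5.2"
```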

Limitations

Alternative models may:
  • Fail reconnaissance phase
  • Miss vulnerability patterns
  • Generate invalid exploits
  • Produce hallucinated findings
Router mode is provided as-is for experimentation. Issues specific to alternative models will not be addressed.
Model performance depends heavily on:
  • Provider API reliability
  • Model capabilities (reasoning, tool use)
  • Request routing latency

Provider Comparison

Anthropic API
Recommended
Pros:
  • Simplest setup
  • Best performance
  • Direct support
  • No intermediary services
Cons:
  • Requires Anthropic account
AWS Bedrock
Pros:
  • Use existing AWS infrastructure
  • Unified billing with AWS
  • Enterprise compliance features
Cons:
  • More complex setup
  • Regional model availability varies
  • Requires AWS account and Bedrock access
Google Vertex AI
Pros:
  • Use existing GCP infrastructure
  • Unified billing with GCP
  • Service account security model
Cons:
  • More complex setup
  • Regional model availability varies
  • Requires GCP project
Router Mode
Experimental
Pros:
  • Experiment with alternative models
  • Test GPT or Gemini performance
Cons:
  • Unsupported
  • Inconsistent results
  • May fail early phases
  • Complex debugging

Troubleshooting

Error:
ERROR: Bedrock mode requires the following env vars in .env:
AWS_REGION ANTHROPIC_SMALL_MODEL ANTHROPIC_MEDIUM_MODEL ANTHROPIC_LARGE_MODEL
Solution: Set AWS_REGION and all three model tier variables, using Bedrock-specific model IDs for your region.
Error:
ERROR: Service account key file not found: ./credentials/gcp-sa-key.json
Solution: Ensure the key file is placed in ./credentials/ directory (not elsewhere).
Error:
ERROR: Permission denied on Vertex AI API
Solution: Verify service account has roles/aiplatform.user role in GCP IAM.
Error:
WARNING: No provider API key set (OPENAI_API_KEY or OPENROUTER_API_KEY)
Solution: Set the appropriate provider API key in .env before running with ROUTER=true.
Issue: Reconnaissance or analysis phases fail with alternative models.
Solution: Router mode is experimental. Switch back to Anthropic Claude models for reliable results.
