Overview
The AWS Bedrock provider enables ZeroClaw to use foundation models via AWS Bedrock’s Converse API. It uses AWS Signature Version 4 (SigV4) authentication with support for EC2 instance metadata, ECS container credentials, and environment variables.

- Provider ID: bedrock
- Alias: aws-bedrock
- Service: bedrock (for SigV4 signing)
- Endpoint: https://bedrock-runtime.{region}.amazonaws.com
- API: Converse API
Authentication
Credential Resolution
Credentials are resolved in the following order:

1. Environment variables:
   - AWS_ACCESS_KEY_ID (required)
   - AWS_SECRET_ACCESS_KEY (required)
   - AWS_SESSION_TOKEN (optional, for temporary credentials)
   - AWS_REGION or AWS_DEFAULT_REGION (default: us-east-1)
2. ECS container credentials:
   - AWS_CONTAINER_CREDENTIALS_RELATIVE_URI (ECS/Fargate)
   - AWS_CONTAINER_CREDENTIALS_FULL_URI (ECS Anywhere)
   - AWS_CONTAINER_AUTHORIZATION_TOKEN (if required)
3. EC2 instance metadata (IMDSv2):
   - Fetches temporary credentials from the instance IAM role
   - Requires network access to 169.254.169.254
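The resolution chain above can be sketched as follows. The function names, return shape, and placeholder fetchers are illustrative assumptions, not ZeroClaw's actual API:

```python
import os

def resolve_credentials(env=os.environ):
    """Return (access_key, secret_key, session_token, source) per the order above."""
    # 1. Environment variables
    if env.get("AWS_ACCESS_KEY_ID") and env.get("AWS_SECRET_ACCESS_KEY"):
        return (env["AWS_ACCESS_KEY_ID"],
                env["AWS_SECRET_ACCESS_KEY"],
                env.get("AWS_SESSION_TOKEN"),
                "env")
    # 2. ECS container credentials (HTTP GET to the container metadata endpoint)
    if env.get("AWS_CONTAINER_CREDENTIALS_RELATIVE_URI") or \
       env.get("AWS_CONTAINER_CREDENTIALS_FULL_URI"):
        return fetch_ecs_credentials(env)
    # 3. EC2 instance metadata (IMDSv2: token PUT, then role-credentials GET
    #    against 169.254.169.254)
    return fetch_imds_credentials()

def fetch_ecs_credentials(env):
    raise NotImplementedError  # placeholder for the ECS metadata call

def fetch_imds_credentials():
    raise NotImplementedError  # placeholder for the IMDSv2 calls
```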
Credential Caching
Credentials are cached for 50 minutes to reduce metadata service calls.

Configuration
Config File
Environment Setup
Cross-Region Inference
Use cross-region inference profiles (model IDs prefixed with a region group, e.g. us.anthropic.claude-*).

Features
Native Tool Calling
Supported: Yes

Bedrock’s Converse API uses a nested tool format.

Vision Support
Supported: Yes

Images are sent as base64-encoded bytes, with MIME types mapped as follows:

- image/png → png
- image/gif → gif
- image/webp → webp
- Others → jpeg
Prompt Caching
Supported: Yes

Bedrock uses cachePoint blocks for prompt caching.
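A sketch of how cachePoint blocks could be inserted according to the thresholds described in the subsections below (3KB system prompts, more than 4 non-system messages). The {"cachePoint": {"type": "default"}} shape follows the Converse API; the helper names are illustrative assumptions:

```python
SYSTEM_CACHE_THRESHOLD = 3 * 1024  # 3KB system-prompt threshold

def with_system_cache(system_text):
    # Append a cachePoint block after the system prompt when it is large enough.
    blocks = [{"text": system_text}]
    if len(system_text.encode("utf-8")) > SYSTEM_CACHE_THRESHOLD:
        blocks.append({"cachePoint": {"type": "default"}})
    return blocks

def with_conversation_cache(messages):
    # Cache the last message when more than 4 non-system messages exist.
    if len(messages) > 4:
        messages[-1]["content"] = messages[-1]["content"] + [
            {"cachePoint": {"type": "default"}}
        ]
    return messages
```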
System Prompt Caching
System prompts larger than 3KB are automatically cached.

Conversation Caching
Conversations with more than 4 non-system messages cache the last message.

Streaming Support
Supported: Yes

Use the converse-stream endpoint for real-time responses.
Token Usage Tracking
Supported: Yes

Usage data is extracted from the response.

API Endpoints
Converse
Endpoint: POST /model/{modelId}/converse
Model ID Format: anthropic.claude-sonnet-4-6 or us.anthropic.claude-*
Request:
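The body below (sketched as a Python dict) illustrates the fields involved. Field names follow the Converse API; the tool `get_weather` and all concrete values are placeholders, not part of ZeroClaw's defaults:

```python
# Illustrative Converse request body.
request_body = {
    "system": [{"text": "You are a helpful assistant."}],
    "messages": [
        {"role": "user", "content": [{"text": "What is the weather in Paris?"}]}
    ],
    "inferenceConfig": {"maxTokens": 4096, "temperature": 0.7},
    "toolConfig": {  # included only when tools are registered
        "tools": [{
            "toolSpec": {
                "name": "get_weather",  # hypothetical tool
                "description": "Look up current weather for a city",
                "inputSchema": {"json": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                }},
            }
        }]
    },
}
```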
Converse Stream
Endpoint: POST /model/{modelId}/converse-stream
Response: AWS EventStream binary format with contentBlockDelta events.
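A minimal frame splitter for this binary format might look like the following. It skips CRC validation (as noted under Limitations) and leaves header decoding out; the function name is an illustrative assumption:

```python
import struct

def split_frames(buf):
    """Split an EventStream byte buffer into (headers, payload) frames.

    Each frame: 4-byte total length, 4-byte headers length, 4-byte prelude
    CRC (all big-endian), then headers, payload, and a 4-byte message CRC.
    CRCs are not validated here.
    """
    frames = []
    offset = 0
    while offset + 12 <= len(buf):
        total_len, headers_len, _prelude_crc = struct.unpack_from(">III", buf, offset)
        if offset + total_len > len(buf):
            break  # incomplete frame; wait for more bytes
        headers = buf[offset + 12 : offset + 12 + headers_len]
        payload = buf[offset + 12 + headers_len : offset + total_len - 4]  # drop CRC
        frames.append((headers, payload))
        offset += total_len
    return frames
```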
Request Configuration
Max Tokens
Default: 4096

Configured via inferenceConfig.maxTokens.
Temperature
Range: 0.0 - 1.0
Default: 0.7 (from config)

Timeouts
- Request timeout: 120 seconds
- Connection timeout: 10 seconds
Message Format
System Blocks
Sent as an array of blocks.

User Messages
Text only.

Assistant Messages
Text only.

Tool Results
Sent as a user message with toolResult blocks.
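The message shapes above can be illustrated as follows (Converse API field names; the toolUseId value and all text contents are placeholders):

```python
# System prompt: an array of blocks.
system_blocks = [{"text": "You are concise."}]

# User and assistant messages: text-only content blocks.
user_message = {"role": "user", "content": [{"text": "List three colors."}]}
assistant_message = {"role": "assistant", "content": [{"text": "Red, green, blue."}]}

# Tool results go back as a *user* message containing toolResult blocks.
tool_result_message = {
    "role": "user",
    "content": [{
        "toolResult": {
            "toolUseId": "tooluse_example123",  # echoes the assistant's toolUse id
            "content": [{"text": "{\"temp_c\": 18}"}],
        }
    }],
}
```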
AWS SigV4 Signing
Signature Process
- Create canonical request
- Create string to sign
- Derive signing key
- Calculate signature
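Steps 3 and 4 (key derivation and signature), plus the Authorization header assembly, can be sketched as below. This follows the standard SigV4 HMAC chain; the function names and example values are illustrative, not ZeroClaw's internals:

```python
import hashlib
import hmac

def hmac_sha256(key, msg):
    return hmac.new(key, msg.encode("utf-8"), hashlib.sha256).digest()

def derive_signing_key(secret_key, date, region, service="bedrock"):
    # SigV4 key derivation: kDate -> kRegion -> kService -> kSigning
    k_date = hmac_sha256(("AWS4" + secret_key).encode("utf-8"), date)
    k_region = hmac_sha256(k_date, region)
    k_service = hmac_sha256(k_region, service)
    return hmac_sha256(k_service, "aws4_request")

def sign(string_to_sign, secret_key, date, region):
    key = derive_signing_key(secret_key, date, region)
    return hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).hexdigest()

def authorization_header(access_key, date, region, signed_headers, signature):
    scope = f"{date}/{region}/bedrock/aws4_request"
    return (f"AWS4-HMAC-SHA256 Credential={access_key}/{scope}, "
            f"SignedHeaders={signed_headers}, Signature={signature}")
```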
Canonical URI
Model IDs with colons (e.g., v1:0) are percent-encoded in the canonical URI.
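For example, the `:` in a versioned model ID becomes `%3A`. A sketch using the standard library (the model ID below is a placeholder, and the helper name is an assumption):

```python
from urllib.parse import quote

def canonical_uri(model_id, action="converse"):
    # Percent-encode the model ID path segment; unreserved chars stay as-is.
    return f"/model/{quote(model_id, safe='-._~')}/{action}"
```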
Authorization Header
Session Token
For temporary credentials (STS, ECS, EC2), the session token is sent in the X-Amz-Security-Token header.

Stop Reasons
Normalized stop reasons:

| Bedrock | ZeroClaw Normalized |
|---|---|
| end_turn | EndTurn |
| max_tokens | MaxTokens |
| stop_sequence | StopSequence |
| tool_use | ToolUse |
| content_filtered | ContentFilter |
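The table is a direct mapping, e.g.:

```python
# Mapping from the table above; the function name and the fallback for
# unknown values are illustrative assumptions.
STOP_REASONS = {
    "end_turn": "EndTurn",
    "max_tokens": "MaxTokens",
    "stop_sequence": "StopSequence",
    "tool_use": "ToolUse",
    "content_filtered": "ContentFilter",
}

def normalize_stop_reason(raw):
    return STOP_REASONS.get(raw, "EndTurn")  # assumed fallback
```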
Error Handling
Authentication Errors
Region Errors
Default region is us-east-1. Override it with AWS_REGION or AWS_DEFAULT_REGION.
Model Not Found
Ensure the model ID matches the Bedrock format (e.g. anthropic.claude-sonnet-4-6, or us.anthropic.claude-* for cross-region profiles).

Provider Capabilities
Model Discovery
Bedrock does not provide a public models endpoint. Use the AWS CLI (aws bedrock list-foundation-models) to list available models.

Example Usage
Simple Chat
With Tools
With Vision
Streaming
EC2 Instance Setup
For EC2 instances, attach an IAM role with Bedrock permissions (e.g. bedrock:InvokeModel and bedrock:InvokeModelWithResponseStream).

ECS/Fargate Setup
For ECS tasks, assign a task IAM role with Bedrock permissions. The provider automatically fetches credentials from the ECS metadata endpoint.

Limitations
- Max tokens is fixed at 4096 (not configurable per request)
- Credential refresh is automatic (50-minute TTL)
- Cross-region inference requires specific model ID format
- No public model discovery endpoint
- CRC validation is skipped for EventStream (relies on TLS)