Ask mode enables you to query your AWS infrastructure using natural language. Clanker analyzes your question, gathers relevant AWS context, and provides intelligent answers.

How it works

Ask mode operates in three phases:

1. Routing: Clanker determines which cloud provider (AWS, GCP, or Azure) the question targets.
2. Context gathering: relevant AWS service data is collected based on keywords in your question.
3. AI analysis: the AI processes the context and provides an answer, with tool-calling support.

Basic usage

Implicit AWS detection

clanker ask "What EC2 instances are running?"
clanker ask "Show me lambda functions with high error rates"
clanker ask "What's the current RDS instance status?"
Clanker automatically detects AWS-related queries through keyword inference.
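The implicit detection above can be sketched as a simple keyword scan. This is an illustrative keyword list, not Clanker's actual implementation:

```go
package main

import (
	"fmt"
	"strings"
)

// isAWSQuestion reports whether a question mentions a common AWS keyword.
// The list here is a small sample; Clanker's real inference covers more
// services (see the keyword table below in this page).
func isAWSQuestion(question string) bool {
	q := strings.ToLower(question)
	for _, kw := range []string{"ec2", "instance", "lambda", "rds", "s3", "iam", "cloudwatch"} {
		if strings.Contains(q, kw) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(isAWSQuestion("What EC2 instances are running?")) // true
}
```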

Explicit AWS flag

clanker ask "List all resources" --aws
clanker ask "Show infrastructure overview" --aws --discovery
Use --aws to explicitly enable AWS context gathering.

Context gathering

Clanker intelligently determines which AWS services to query based on your question:

Service detection

From ~/workspace/source/internal/aws/client.go:198-262:
func (c *Client) GetRelevantContext(ctx context.Context, question string) (string, error) {
    var context strings.Builder
    questionLower := strings.ToLower(question)

    if strings.Contains(questionLower, "ec2") || strings.Contains(questionLower, "instance") {
        if ec2Info, err := c.getEC2Info(ctx); err == nil {
            context.WriteString("EC2 Instances:\n")
            context.WriteString(ec2Info)
        }
    }

    if strings.Contains(questionLower, "lambda") || strings.Contains(questionLower, "function") {
        if lambdaInfo, err := c.getLambdaInfo(ctx); err == nil {
            context.WriteString("Lambda Functions:\n")
            context.WriteString(lambdaInfo)
        }
    }

    if strings.Contains(questionLower, "rds") || strings.Contains(questionLower, "database") {
        if rdsInfo, err := c.getRDSInfo(ctx); err == nil {
            context.WriteString("RDS Instances:\n")
            context.WriteString(rdsInfo)
        }
    }

    // ...checks for the remaining services elided...

    return context.String(), nil
}

Supported service keywords

| Keyword | Service | Data retrieved |
| --- | --- | --- |
| ec2, instance | EC2 | Instance state, type, IPs |
| lambda, function | Lambda | Functions, runtime, last modified |
| rds, database | RDS | DB instances, status, config |
| s3, bucket | S3 | Buckets, creation dates |
| ecs, container | ECS | Clusters, services, tasks |
| iam, role | IAM | Roles (names only) |
| log, cloudwatch, error | CloudWatch Logs | Log groups, recent errors |
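The keyword table maps naturally onto a table-driven dispatch. A hypothetical restructuring of the detection logic, not Clanker's actual code:

```go
package main

import (
	"fmt"
	"strings"
)

// serviceRule pairs trigger keywords with the AWS service they select.
type serviceRule struct {
	keywords []string
	service  string
}

// rules mirrors the keyword table in this section.
var rules = []serviceRule{
	{[]string{"ec2", "instance"}, "EC2"},
	{[]string{"lambda", "function"}, "Lambda"},
	{[]string{"rds", "database"}, "RDS"},
	{[]string{"s3", "bucket"}, "S3"},
	{[]string{"ecs", "container"}, "ECS"},
	{[]string{"iam", "role"}, "IAM"},
	{[]string{"log", "cloudwatch", "error"}, "CloudWatch Logs"},
}

// matchedServices returns every service whose keywords appear in the question.
func matchedServices(question string) []string {
	q := strings.ToLower(question)
	var out []string
	for _, r := range rules {
		for _, kw := range r.keywords {
			if strings.Contains(q, kw) {
				out = append(out, r.service)
				break
			}
		}
	}
	return out
}

func main() {
	fmt.Println(matchedServices("Which Lambda functions connect to which RDS databases?")) // [Lambda RDS]
}
```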

Error log analysis

When asking about errors, Clanker fetches recent error logs:
clanker ask "Show me the last error from my Lambda function"
From ~/workspace/source/internal/aws/client.go:273-284:
if strings.Contains(questionLower, "error") || strings.Contains(questionLower, "last error") {
    errorLogs, err := c.getRecentErrorLogs(ctx, questionLower)
    if err != nil {
        context.WriteString(fmt.Sprintf("Note: Could not fetch recent error logs: %v\n\n", err))
    } else if errorLogs != "" {
        context.WriteString("Recent Error Logs:\n")
        context.WriteString(errorLogs)
    }
}

AI tool calling

When AWS or GitHub context is enabled, Clanker uses AI tool calling for dynamic operations:

Tool-based execution

From ~/workspace/source/cmd/ask.go:1050-1061:
awsProfileForTools := profile
if awsProfileForTools == "" {
    awsProfileForTools = ai.FindInfraAnalysisProfile()
}

response, err := aiClient.AskWithTools(
    ctx,
    question,
    awsContext,
    combinedCodeContext,
    awsProfileForTools,
    githubContext,
)

Available tools

Tools enable the AI to:
  • Execute AWS CLI commands dynamically
  • Query specific resources by ID/ARN
  • Analyze CloudWatch metrics
  • Inspect security group rules
  • Check IAM permissions

Advanced queries

Multi-service questions

clanker ask "Which Lambda functions connect to which RDS databases?"
Clanker gathers context from multiple services and correlates the data.

Time-based queries

clanker ask "Show me Lambda errors from the last hour"
CloudWatch logs are filtered by time range automatically.

Cost analysis

clanker ask "What's my estimated monthly cost for EC2?"
Uses AWS Cost Explorer data when available.

Profile and region selection

Specify AWS profile

clanker ask "List S3 buckets" --profile prod-account

Use environment-based profiles

Configure environments in ~/.clanker/config.yaml:
infra:
  default_environment: dev
  aws:
    environments:
      dev:
        profile: dev-profile
        region: us-east-1
      prod:
        profile: prod-profile
        region: us-west-2
Then query:
# Uses dev environment (default)
clanker ask "Show EC2 instances"

# Override environment
clanker ask "Show EC2 instances" --profile prod-profile
From ~/workspace/source/cmd/ask.go:664-678:
targetProfile := profile
if targetProfile == "" {
    defaultEnv := viper.GetString("infra.default_environment")
    if defaultEnv == "" {
        defaultEnv = "dev"
    }
    targetProfile = viper.GetString(fmt.Sprintf("infra.aws.environments.%s.profile", defaultEnv))
    if targetProfile == "" {
        targetProfile = viper.GetString("aws.default_profile")
    }
    if targetProfile == "" {
        targetProfile = "default"
    }
}
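The precedence in the snippet above (explicit --profile, then the default environment's profile, then aws.default_profile, then "default") can be expressed as a pure fallback chain. An illustrative refactor, not Clanker's code:

```go
package main

import "fmt"

// resolveProfile returns the first non-empty candidate, mirroring the
// fallback order in cmd/ask.go: flag > environment profile >
// aws.default_profile > "default".
func resolveProfile(flagProfile, envProfile, defaultProfile string) string {
	for _, p := range []string{flagProfile, envProfile, defaultProfile} {
		if p != "" {
			return p
		}
	}
	return "default"
}

func main() {
	fmt.Println(resolveProfile("", "dev-profile", "")) // dev-profile
}
```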

Discovery mode

Enable comprehensive infrastructure scanning:
clanker ask "What AWS services am I using?" --discovery
Discovery mode:
  • Activates AWS and Terraform contexts
  • Checks all services in parallel
  • Returns resource counts and availability
From ~/workspace/source/cmd/ask.go:528-535:
if discovery {
    includeAWS = true
    includeTerraform = true
    if debug {
        fmt.Println("Discovery mode enabled: AWS and Terraform contexts activated")
    }
}

Routing logic

Clanker uses intelligent routing to determine the target service:

Keyword-based inference

# Routes to AWS
clanker ask "Show EC2 instances"

# Routes to GCP
clanker ask "List Cloud Run services"

# Routes to Cloudflare
clanker ask "Show DNS records"

LLM classification for ambiguous queries

When multiple services are detected, Clanker uses LLM-based routing.
From ~/workspace/source/cmd/ask.go:562-583:
if routing.NeedsLLMClassification(svcCtx) {
    if debug {
        fmt.Println("[routing] Ambiguous query detected, using LLM for classification...")
    }
    
    llmService, err := routing.ClassifyWithLLM(context.Background(), routingQuestion, debug)
    if err != nil {
        // Fallback to keyword inference
        if debug {
            fmt.Printf("[routing] LLM classification failed (%v), falling back to keyword inference\n", err)
        }
    } else {
        routing.ApplyLLMClassification(&svcCtx, llmService)
    }
}

Route-only mode (for integrations)

clanker ask "Show S3 buckets" --route-only
Returns JSON routing decision:
{
  "agent": "aws",
  "reason": "Question contains AWS service keyword: s3"
}

AI provider configuration

Configure AI provider

Set your preferred AI provider in ~/.clanker/config.yaml:
ai:
  default_provider: openai
  providers:
    openai:
      api_key: sk-...
      model: gpt-4
    anthropic:
      api_key: sk-ant-...
      model: claude-3-5-sonnet-20241022

Override via flags

# Use specific provider
clanker ask "List EC2 instances" --ai-profile anthropic

# Use specific model
clanker ask "List EC2 instances" --openai-model gpt-4-turbo

Examples

clanker ask "Which EC2 instances are stopped?"
Clanker gathers EC2 context and filters by state.
clanker ask "Show Lambda functions with cold start issues"
Analyzes CloudWatch metrics for initialization duration.
clanker ask "Which security groups allow 0.0.0.0/0 access?"
Scans all security groups and identifies overly permissive rules.
clanker ask "Which resources are using the most expensive instance types?"
Correlates EC2 instance types with pricing data.

Next steps

  • Maker/apply workflow: generate and execute infrastructure plans
  • Service support: see all supported AWS services
