Ask questions about your cluster in plain English:
```bash
# Basic queries
clanker k8s ask "how many pods are running"
clanker k8s ask "show me all deployments"
clanker k8s ask "list services in default namespace"

# Resource usage
clanker k8s ask "which pods are using the most memory"
clanker k8s ask "show me node CPU usage"
clanker k8s ask "what's the cluster utilization"

# Troubleshooting
clanker k8s ask "why is my nginx pod failing"
clanker k8s ask "show me error logs for the api pod"
clanker k8s ask "tell me the health of my cluster"
```
```bash
# Use cluster name and AWS profile
clanker k8s ask --cluster my-cluster --profile myaws "how many nodes"

# Works with the current kubeconfig context
clanker k8s ask "show me all pods"
```
Ask complex questions that require multiple operations:
```bash
clanker k8s ask "find the pod using the most memory and show its logs"
clanker k8s ask "tell me which deployment has failing pods and why"
clanker k8s ask "compare CPU usage between production and staging namespaces"
```
```bash
# First question
clanker k8s ask "show me all pods in default namespace"

# Follow-up (remembers we're talking about the default namespace)
clanker k8s ask "which one is using the most memory"

# Another follow-up
clanker k8s ask "show me its logs"
```
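Follow-up questions work because history is kept per cluster. As a rough sketch of how such per-cluster history could be stored (the `historyEntry` and `conversationStore` names are hypothetical, not clanker's actual types):

```go
package main

import "fmt"

// historyEntry is one question/answer turn (hypothetical structure;
// the real implementation may persist more context).
type historyEntry struct {
	Question string
	Answer   string
}

// conversationStore keys history by cluster name, so a follow-up like
// "show me its logs" can be resolved against the earlier turns for
// the same cluster.
type conversationStore struct {
	byCluster map[string][]historyEntry
}

func newConversationStore() *conversationStore {
	return &conversationStore{byCluster: map[string][]historyEntry{}}
}

// Append records a completed turn for the given cluster.
func (s *conversationStore) Append(cluster, question, answer string) {
	s.byCluster[cluster] = append(s.byCluster[cluster], historyEntry{question, answer})
}

// History returns all prior turns for the cluster, oldest first.
func (s *conversationStore) History(cluster string) []historyEntry {
	return s.byCluster[cluster]
}

func main() {
	store := newConversationStore()
	store.Append("my-cluster", "show me all pods in default namespace", "pod list")
	store.Append("my-cluster", "which one is using the most memory", "api pod")
	fmt.Println(len(store.History("my-cluster")))
}
```

Keying by cluster rather than by shell session means follow-ups keep working across separate invocations of the CLI, as long as they target the same cluster.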
```go
var k8sAskCmd = &cobra.Command{
	Use:   "ask [question]",
	Short: "Ask natural language questions about your Kubernetes cluster",
	Long: `Ask natural language questions about your Kubernetes cluster using AI.

The AI will analyze your question, determine what kubectl operations are needed,
execute them, and provide a comprehensive markdown-formatted response.

Conversation history is maintained per cluster for follow-up questions.

Examples:
  clanker k8s ask "how many pods are running"
  clanker k8s ask --cluster test-cluster "show me all deployments"
  clanker k8s ask "which pods are using the most memory"
  clanker k8s ask "why is my pod crashing"`,
	Args: cobra.ExactArgs(1),
	RunE: runK8sAsk,
}
```
The AI decision flow is handled by the LLM integration: