Configure Cline to use Anthropic, OpenRouter, OpenAI, Google Gemini, AWS Bedrock, Azure OpenAI, and GCP Vertex AI.
Cline connects to over 30 cloud providers. This page covers the major ones: what credentials you need, where to get them, and how to configure each in Cline.
If you’re unsure which provider to start with, OpenRouter gives you access to models from all major providers through a single API key — useful for experimenting before committing to a single provider.
Anthropic makes the Claude family of models. Claude Sonnet and Opus have the strongest tool-use reliability of any model family, making Anthropic a popular first choice for Cline users. Website: anthropic.com
OpenRouter is a unified API gateway that gives you access to models from Anthropic, OpenAI, Google, Mistral, Meta, DeepSeek, and many others — all through a single API key. Cline automatically fetches the full model list from OpenRouter. Website: openrouter.ai
Model selection: OpenRouter hosts 100+ models. See openrouter.ai/models for the full list with pricing.
Prompt compression: Enable “Compress prompts and message chains to the context size” in Cline to activate OpenRouter’s middle-out transform, which helps when prompts approach the model’s context limit.
Prompt caching: Automatically passed through to models that support it. For Gemini models specifically, you must manually turn on "Enable Prompt Caching" in Cline's provider settings.
Rate limits: OpenRouter distributes requests across multiple providers, so you’re less likely to hit per-provider rate limits.
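Once you have a key, you can sanity-check it outside Cline against OpenRouter's OpenAI-compatible endpoint. A sketch (the key is a placeholder, and the model id is only an example — check openrouter.ai/models for current ids):

```shell
# Placeholder key -- create yours at openrouter.ai/keys.
export OPENROUTER_API_KEY="sk-or-example"

# Sanity-check the key against OpenRouter's OpenAI-compatible endpoint.
# Uncomment to run; requires a valid key and network access.
# curl https://openrouter.ai/api/v1/chat/completions \
#   -H "Authorization: Bearer $OPENROUTER_API_KEY" \
#   -H "Content-Type: application/json" \
#   -d '{"model": "anthropic/claude-3.5-sonnet",
#        "messages": [{"role": "user", "content": "Hello"}]}'
```

A successful response confirms the key works before you enter it in Cline.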
OpenAI provides the GPT-4 and GPT-5 model families, plus reasoning-focused o-series models. If you already have an OpenAI account, this is the most direct way to access GPT models. Website: openai.com
ChatGPT subscription: If you have a ChatGPT Plus, Pro, or Team plan, use the OpenAI Codex provider instead to use your subscription directly — no separate API billing needed.
Azure OpenAI: For enterprise Azure deployments, see the Azure tab.
Google’s Gemini models offer some of the largest context windows available — up to 2M tokens — making them well-suited for large codebases and long documents. Website: ai.google.dev
AWS Bedrock gives you access to Claude, Llama, Mistral, and other models through your AWS account. Useful for teams that need to keep API traffic within AWS infrastructure or want consolidated AWS billing. Website: aws.amazon.com/bedrock
Bedrock uses AWS IAM credentials rather than a simple API key.
1. Enable Bedrock model access
In the AWS Console, go to Amazon Bedrock → Model access and request access to the models you want to use (e.g., Anthropic Claude). Approval is usually instant.
2. Create an IAM user or role
Go to IAM → Users and create a user (or use an existing role). Attach the AmazonBedrockFullAccess policy or a scoped policy that includes bedrock:InvokeModel.
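If you prefer a scoped policy over AmazonBedrockFullAccess, a minimal sketch might look like the following (the Resource ARN and user name are illustrative assumptions — narrow them to the models and regions you actually use):

```shell
# Write a minimal scoped Bedrock policy (a sketch, not a definitive policy).
cat > bedrock-invoke-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel",
        "bedrock:InvokeModelWithResponseStream"
      ],
      "Resource": "arn:aws:bedrock:*::foundation-model/*"
    }
  ]
}
EOF

# Attach it to your IAM user (user name is a placeholder):
# aws iam put-user-policy --user-name cline-bedrock \
#   --policy-name BedrockInvoke --policy-document file://bedrock-invoke-policy.json
```

Including bedrock:InvokeModelWithResponseStream covers streaming responses, which Cline uses for model output.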
3. Generate access keys
For IAM users: go to the user’s Security credentials tab and click Create access key. Download the access key ID and secret access key.
Enter your AWS Access Key ID and AWS Secret Access Key.
Set your AWS Region (e.g., us-east-1). Bedrock model availability varies by region.
Select your model.
You can also use environment variables (AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, AWS_REGION) or an AWS profile if you prefer not to store credentials in Cline settings.
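The environment-variable route can be sketched like this (credential values are placeholders — substitute the key pair you downloaded in step 3):

```shell
# Placeholder credentials -- substitute your own access key pair.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEYID"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_REGION="us-east-1"   # pick a region where your models are enabled

# Or keep credentials in a named profile (~/.aws/credentials) and select it:
# export AWS_PROFILE="cline"
```

Cline and the AWS SDKs pick these up automatically, so nothing sensitive needs to live in Cline's settings file.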
Region availability: Not all models are available in all regions. Check AWS Bedrock model availability before selecting a region.
Pricing: You pay AWS directly based on token usage. Prices are roughly equivalent to going direct, sometimes with a small markup. See aws.amazon.com/bedrock/pricing.
Azure OpenAI requires creating a deployment rather than selecting a model directly.
1. Create an Azure OpenAI resource
In the Azure Portal, create an Azure OpenAI resource in a supported region.
2. Deploy a model
In Azure AI Foundry (formerly Azure OpenAI Studio), create a deployment for a model (e.g., gpt-4o). Note the deployment name you choose — this is what you’ll enter in Cline.
3. Get your endpoint and API key
In the Azure Portal, go to your Azure OpenAI resource → Keys and Endpoint. Copy your Endpoint URL and Key 1.
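The three pieces you collect map onto Cline's Azure settings roughly as follows (all values below are placeholders, and the api-version shown is only an example of the URL shape):

```shell
# Placeholders -- substitute your resource name, key, and deployment name.
export AZURE_OPENAI_ENDPOINT="https://my-resource.openai.azure.com"  # Keys and Endpoint -> Endpoint
export AZURE_OPENAI_API_KEY="example-key"                            # Keys and Endpoint -> Key 1
export AZURE_OPENAI_DEPLOYMENT="gpt-4o"                              # the deployment name you chose

# The REST URL ultimately called has this shape (api-version is an example):
# $AZURE_OPENAI_ENDPOINT/openai/deployments/$AZURE_OPENAI_DEPLOYMENT/chat/completions?api-version=2024-10-21
```

Note that the deployment name is what you typed when deploying, not the underlying model id — the two only match if you named them identically.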
GCP Vertex AI gives you access to Claude (via Anthropic’s Vertex integration) and Google’s Gemini models within your Google Cloud infrastructure. Website: cloud.google.com/vertex-ai
Provide your service account credentials — either paste the JSON key contents or set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of your key file.
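The environment-variable option can be sketched as follows (the key-file path is a placeholder — point it at wherever you saved your service account's JSON key):

```shell
# Placeholder path -- point it at your downloaded service account key file.
export GOOGLE_APPLICATION_CREDENTIALS="$HOME/keys/vertex-sa.json"

# Alternatively, use application-default credentials via the gcloud CLI:
# gcloud auth application-default login
```

GOOGLE_APPLICATION_CREDENTIALS is the standard variable Google client libraries check, so tools besides Cline will pick it up too.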