Node to Code supports six LLM providers split into two categories: cloud providers that require an API key and send requests over the internet, and local providers that run entirely on your machine with no data leaving your system.

Selecting a provider

Go to Edit → Project Settings → Plugins → Node to Code and set the LLM Provider field to your preferred option. Each provider has its own settings section under Node to Code | LLM Services in the same panel, where you can enter API keys, choose a model, and adjust provider-specific options.

Cloud providers

Cloud providers offer the strongest models available today. They require an account and API key from the respective service, and each request is billed based on token usage.

OpenAI

Supports GPT-4.1, o3, o4 Mini, and more. o4 Mini is recommended for the best price-to-performance ratio.

Anthropic Claude

Claude 4 Sonnet is the default provider and model, delivering strong results across a wide range of Blueprint graphs.

Google Gemini

Gemini 2.5 Flash is recommended for price-performance; Gemini 2.5 Pro offers maximum capability.

DeepSeek

DeepSeek R1 is a cost-effective reasoning model that delivers strong results at a competitive price point.

Local providers

Local providers run models directly on your machine. No API key is required and no data is sent to external servers, making them ideal for proprietary codebases or air-gapped environments.

Ollama

Runs open-source models locally via Ollama. The default model is qwen3:32b, and you get full configuration control, including the context window, sampling parameters, and more.
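As a quick sanity check outside the editor, you can confirm an Ollama server will accept the plugin's default model by hitting Ollama's documented REST API. The sketch below only builds the request (the endpoint is Ollama's standard default, http://localhost:11434; the prompt is a hypothetical placeholder); sending it requires a running Ollama instance with the model pulled.

```python
import json
import urllib.request

# Ollama's default local endpoint and the plugin's default model.
OLLAMA_URL = "http://localhost:11434/api/generate"
DEFAULT_MODEL = "qwen3:32b"

def build_generate_request(prompt: str, model: str = DEFAULT_MODEL) -> urllib.request.Request:
    """Build a non-streaming /api/generate request for a local Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return one complete response instead of a token stream
    }).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_generate_request("Convert this Blueprint graph to C++.")
# To actually send: urllib.request.urlopen(req) against a running server.
```

If the server responds, the JSON body's "response" field contains the generated text; a connection error usually means Ollama is not running or the model has not been pulled yet.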

LM Studio

Load any GGUF model in LM Studio and connect via its local OpenAI-compatible server. Default endpoint: http://localhost:1234.
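Because LM Studio exposes an OpenAI-compatible server, any standard chat-completions request works against the default endpoint above. This is a minimal sketch of such a request; the model name is a placeholder (LM Studio serves whichever GGUF model is currently loaded), and only request construction is shown.

```python
import json
import urllib.request

# LM Studio's default local OpenAI-compatible endpoint.
LMSTUDIO_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt: str, model: str = "qwen3-32b") -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a local LM Studio server."""
    payload = json.dumps({
        "model": model,  # placeholder; LM Studio uses the currently loaded model
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        LMSTUDIO_URL, data=payload, headers={"Content-Type": "application/json"}
    )

req = build_chat_request("Summarize this Blueprint graph.")
# To actually send: urllib.request.urlopen(req) with LM Studio's server running.
```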

Provider comparison

Provider     API key required   Runs locally   Default model
OpenAI       Yes                No             o4 Mini
Anthropic    Yes                No             Claude 4 Sonnet
Gemini       Yes                No             Gemini 2.5 Flash Preview
DeepSeek     Yes                No             DeepSeek R1
Ollama       No                 Yes            qwen3:32b
LM Studio    No                 Yes            qwen3-32b

If you’re just getting started, Anthropic Claude 4 Sonnet is the default and works well out of the box. If you want zero cost and full privacy, try Ollama or LM Studio.
