This integration connects xAI’s Grok models to LangChain.

Installation

pip install -U langchain-xai

Setup

Set your xAI API key as an environment variable:
export XAI_API_KEY="your-api-key"
Get your API key from console.x.ai.

Usage

from langchain_xai import ChatXAI

model = ChatXAI(
    model="grok-4",
    temperature=0,
    max_tokens=None,
)

messages = [
    ("system", "You are a helpful assistant."),
    ("human", "What is the capital of France?"),
]

response = model.invoke(messages)
print(response.content)

Streaming

for chunk in model.stream(messages):
    print(chunk.content, end="")

API Reference

ChatXAI

ChatXAI extends ChatOpenAI and inherits all of its parameters. It’s preconfigured to use xAI’s API endpoint.
model (str, required)
    Name of the xAI model to use (e.g., grok-4, grok-3.5, grok-vision-beta).
temperature (float, default: 1)
    Sampling temperature between 0 and 2. Higher values mean more random completions; lower values (like 0.2) mean more focused and deterministic completions.
max_tokens (int | None, default: None)
    Maximum number of tokens to generate. Refer to your model's documentation for limits.
logprobs (bool | None, default: None)
    Whether to return log probabilities of output tokens.
timeout (float | None, default: None)
    Timeout for requests, in seconds.
max_retries (int, default: 2)
    Maximum number of retries for failed requests.
api_key (str | None, default: None)
    xAI API key. If not provided, it is read from the XAI_API_KEY environment variable.

Supported Models

  • Grok-4: Latest flagship model with advanced capabilities
  • Grok-3.5: Balanced performance and speed
  • Grok Vision Beta: Multimodal model with vision capabilities
See docs.x.ai/docs/models for the latest model availability and pricing.

Features

  • Text generation
  • Function/tool calling
  • Vision capabilities (Grok Vision Beta)
  • Streaming
  • Async support
  • Real-time information access (via X/Twitter)
Grok models have unique access to real-time information from X (formerly Twitter), which can be useful for current events and trending topics. Refer to xAI’s documentation for API details.

Vision Example

For models with vision capabilities:
from langchain_xai import ChatXAI
from langchain_core.messages import HumanMessage

model = ChatXAI(model="grok-vision-beta")

message = HumanMessage(
    content=[
        {"type": "text", "text": "What's in this image?"},
        {
            "type": "image_url",
            "image_url": {"url": "https://example.com/image.jpg"},
        },
    ],
)

response = model.invoke([message])
print(response.content)
