Overview

Prompt templates allow you to create reusable prompts with variable substitution. LangChain provides templates for both string-based prompts and chat-based prompts.

BasePromptTemplate

Base class for all prompt templates. Source: langchain_core.prompts.base:39 Inherits: RunnableSerializable[dict, PromptValue]

Type Parameters

FormatOutputType
TypeVar
The output type when formatting the prompt

Properties

input_variables
list[str]
required
List of variable names required as inputs to the prompt
optional_variables
list[str]
default:"[]"
List of optional variable names (auto-inferred from placeholders)
input_types
dict[str, Any]
default:"{}"
Dictionary mapping variable names to their expected types. Defaults to str for all.
output_parser
BaseOutputParser | None
default:"None"
Optional output parser to apply to the LLM response
partial_variables
Mapping[str, Any]
default:"{}"
Partial variables that are pre-filled
metadata
dict[str, Any] | None
default:"None"
Metadata for tracing
tags
list[str] | None
default:"None"
Tags for tracing

Methods

invoke

def invoke(
    self,
    input: dict,
    config: RunnableConfig | None = None
) -> PromptValue
Format the prompt with input variables.
input
dict
required
Dictionary of variable values
config
RunnableConfig | None
default:"None"
Optional runnable configuration (callbacks, tags, metadata) for tracing
return
PromptValue
Formatted prompt value

format_prompt

def format_prompt(self, **kwargs: Any) -> PromptValue
Format the prompt with keyword arguments.

partial

def partial(self, **kwargs: Any) -> BasePromptTemplate
Return a partial prompt template with some variables pre-filled.
**kwargs
Any
required
Variables to pre-fill
return
BasePromptTemplate
New prompt template with partial variables

PromptTemplate

String prompt template. Uses Python’s str.format (f-string) syntax by default; jinja2 and mustache formats are also supported. Source: langchain_core.prompts.prompt Inherits: BasePromptTemplate

Properties

template
str
required
The template string with {variable} placeholders
template_format
Literal['f-string', 'jinja2', 'mustache']
default:"'f-string'"
Format of the template string
validate_template
bool
default:"False"
Whether to validate the template on initialization

Constructor

@classmethod
def from_template(
    cls,
    template: str,
    *,
    template_format: str = "f-string",
    partial_variables: dict[str, Any] | None = None,
    **kwargs: Any
) -> PromptTemplate
Create a prompt template from a template string.
template
str
required
Template string with variables in {variable} format
template_format
str
default:"'f-string'"
Template format ('f-string', 'jinja2', or 'mustache')
partial_variables
dict | None
default:"None"
Variables to pre-fill

Example

from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate.from_template(
    "Tell me a joke about {topic}"
)
prompt.invoke({"topic": "cats"})  # PromptValue
prompt.format(topic="dogs")  # "Tell me a joke about dogs"

ChatPromptTemplate

Template for chat-based prompts using messages. Source: langchain_core.prompts.chat Inherits: BasePromptTemplate

Properties

messages
list[MessageLike]
required
List of message templates or placeholders

Constructor

@classmethod
def from_messages(
    cls,
    messages: Sequence[MessageLike | tuple[str, str]]
) -> ChatPromptTemplate
Create a chat prompt template from a list of messages.
messages
Sequence
required
List of message templates. Each can be:
  • A tuple like ("human", "Hello {name}")
  • A BaseMessage instance
  • A BaseMessagePromptTemplate
  • A MessagesPlaceholder

Example

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder("history"),
    ("human", "{input}")
])

result = prompt.invoke({
    "history": [("human", "Hi"), ("ai", "Hello!")],
    "input": "What's the weather?"
})

Class Methods

from_template

@classmethod
def from_template(
    cls,
    template: str,
    *,
    role: str = "human",
    **kwargs: Any
) -> ChatPromptTemplate
Create a chat prompt with a single message.
template
str
required
Template string
role
str
default:"'human'"
Role of the message ('human', 'ai', 'system', etc.)

MessagesPlaceholder

Placeholder for inserting a list of messages. Source: langchain_core.prompts.chat:52 Inherits: BaseMessagePromptTemplate

Properties

variable_name
str
required
Name of the variable containing messages
optional
bool
default:"False"
Whether the placeholder is optional. If True, returns empty list when variable not provided.
n_messages
PositiveInt | None
default:"None"
Maximum number of messages to include. If None, includes all.

Constructor

def __init__(
    self,
    variable_name: str,
    *,
    optional: bool = False,
    **kwargs: Any
)
variable_name
str
required
Name of variable containing messages
optional
bool
default:"False"
Whether the variable is optional

Example

from langchain_core.prompts import ChatPromptTemplate, MessagesPlaceholder

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    MessagesPlaceholder("chat_history"),
    ("human", "{question}")
])

prompt.invoke({
    "chat_history": [
        ("human", "What's 2+2?"),
        ("ai", "4")
    ],
    "question": "What's 3+3?"
})

HumanMessagePromptTemplate

Template for creating human messages. Source: langchain_core.prompts.message Inherits: BaseMessagePromptTemplate

Properties

prompt
StringPromptTemplate
required
The underlying string prompt template

Constructor

@classmethod
def from_template(
    cls,
    template: str,
    *,
    template_format: str = "f-string",
    **kwargs: Any
) -> HumanMessagePromptTemplate

AIMessagePromptTemplate

Template for creating AI messages. Source: langchain_core.prompts.message Inherits: BaseMessagePromptTemplate

SystemMessagePromptTemplate

Template for creating system messages. Source: langchain_core.prompts.message Inherits: BaseMessagePromptTemplate

FewShotPromptTemplate

Template for few-shot learning with examples. Source: langchain_core.prompts.few_shot Inherits: BasePromptTemplate

Properties

examples
list[dict] | None
default:"None"
List of example dictionaries (provide exactly one of examples or example_selector)
example_selector
BaseExampleSelector | None
default:"None"
Selector that chooses examples dynamically per input (alternative to examples)
example_prompt
PromptTemplate
required
Template for formatting each example
suffix
str
required
Suffix template appended after examples
prefix
str
default:"''"
Prefix template prepended before examples
example_separator
str
default:"'\n\n'"
String separator between examples

Example

from langchain_core.prompts import FewShotPromptTemplate, PromptTemplate

example_prompt = PromptTemplate.from_template(
    "Question: {question}\nAnswer: {answer}"
)

prompt = FewShotPromptTemplate(
    examples=[
        {"question": "What's 2+2?", "answer": "4"},
        {"question": "What's 3+3?", "answer": "6"}
    ],
    example_prompt=example_prompt,
    suffix="Question: {input}\nAnswer:",
    input_variables=["input"]
)

prompt.format(input="What's 4+4?")

PipelinePromptTemplate

Compose multiple prompt templates in a pipeline. Source: langchain_core.prompts.pipeline Inherits: BasePromptTemplate

Properties

final_prompt
BasePromptTemplate
required
Final prompt template to format
pipeline_prompts
list[tuple[str, BasePromptTemplate]]
required
List of (name, prompt) tuples; each is formatted in order and its output is passed to final_prompt under that name

ImagePromptTemplate

Template for creating prompts with images. Source: langchain_core.prompts.image Inherits: BasePromptTemplate

Properties

template
str | dict
required
Image URL template or image data template

Example

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.prompts.image import ImagePromptTemplate

prompt = ChatPromptTemplate.from_messages([
    ("human", [
        {"type": "text", "text": "What's in this image?"},
        {"type": "image", "source": {"type": "url", "url": "{image_url}"}}
    ])
])

prompt.invoke({"image_url": "https://example.com/image.jpg"})
