Available Modules
LLM Providers
Marimo supports multiple LLM providers through the mo.ai.llm module:
- OpenAI: mo.ai.llm.openai
- Anthropic: mo.ai.llm.anthropic
- Google: mo.ai.llm.google
- Groq: mo.ai.llm.groq
- Bedrock: mo.ai.llm.bedrock
- Pydantic AI: mo.ai.llm.pydantic_ai
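For example, a provider model can be passed to marimo's chat UI element. A minimal sketch; the model name "gpt-4o" and the system message are illustrative choices, and an API key must be available to the provider (e.g. via environment variable):

```python
import marimo as mo

# Wire an OpenAI-backed model into marimo's chat element.
# The model name and system message here are illustrative.
chat = mo.ui.chat(
    mo.ai.llm.openai(
        "gpt-4o",
        system_message="You are a helpful assistant.",
    )
)
chat  # display the chat UI in the notebook
```

The other providers follow the same pattern: construct the model from mo.ai.llm and hand it to mo.ui.chat.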
Core Types
ChatMessage
A message in a chat. Attributes:
- role (Literal["user", "assistant", "system"]): The role of the message
- content (Any): The content of the message (can be a rich Python object)
- id (str): The id of the message
- parts (list[ChatPart]): Parts from the AI SDK Stream Protocol (must be serializable to JSON)
- attachments (Optional[list[ChatAttachment]]): Optional attachments to the message
- metadata (Any | None): Optional metadata
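A custom chat model is a callable that receives the message history and a config. The sketch below uses a hedged dataclass stand-in for ChatMessage (not marimo's actual class) to show how a model function reads role and content from the messages:

```python
from dataclasses import dataclass
from typing import Any, Optional

# Stand-in mirroring the ChatMessage attributes above; illustrative only,
# not marimo's own class.
@dataclass
class ChatMessage:
    role: str                      # "user", "assistant", or "system"
    content: Any                   # may be a rich Python object
    attachments: Optional[list] = None

def echo_model(messages, config):
    # Respond based on the most recent message in the history.
    last = messages[-1]
    return f"You ({last.role}) said: {last.content}"

history = [ChatMessage(role="user", content="hello")]
print(echo_model(history, None))  # You (user) said: hello
```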
ChatAttachment
Represents a file attachment in a chat message. Attributes:
- url (str): The URL of the attachment (can be a hosted file URL or a data URL)
- name (str): The name of the attachment, usually the file name (default: "attachment")
- content_type (Optional[str]): A string indicating the media type (extracted from the pathname extension by default)
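The default content_type behavior (inferring the media type from the file-name extension) can be sketched with the standard library's mimetypes module. The dataclass below is a hedged stand-in, not marimo's class:

```python
import base64
import mimetypes
from dataclasses import dataclass
from typing import Optional

# Stand-in mirroring the ChatAttachment attributes above; illustrative only.
@dataclass
class ChatAttachment:
    url: str
    name: str = "attachment"
    content_type: Optional[str] = None

    def __post_init__(self):
        if self.content_type is None:
            # Default: infer the media type from the file-name extension.
            self.content_type, _ = mimetypes.guess_type(self.name)

# Attachments can use a data URL instead of a hosted file URL.
data = base64.b64encode(b"hello").decode()
att = ChatAttachment(url=f"data:text/plain;base64,{data}", name="notes.txt")
print(att.content_type)  # text/plain
```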
ChatModelConfig
Configuration for chat models. Attributes:
- max_tokens (Optional[int]): Maximum number of tokens
- temperature (Optional[float]): Temperature for the model (randomness)
- top_p (Optional[float]): Restriction on the cumulative probability of prediction candidates
- top_k (Optional[int]): Number of top prediction candidates to consider
- frequency_penalty (Optional[float]): Penalty for tokens that appear frequently
- presence_penalty (Optional[float]): Penalty for tokens that have already appeared at least once
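Since every field is optional, a common pattern is to pass only the set fields through to the provider so that unset ones fall back to provider defaults. A sketch with a hedged stand-in dataclass (the to_request_params helper is hypothetical, not part of marimo):

```python
from dataclasses import asdict, dataclass
from typing import Optional

# Stand-in mirroring the ChatModelConfig attributes above; illustrative only.
@dataclass
class ChatModelConfig:
    max_tokens: Optional[int] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    top_k: Optional[int] = None
    frequency_penalty: Optional[float] = None
    presence_penalty: Optional[float] = None

def to_request_params(config: ChatModelConfig) -> dict:
    # Drop unset (None) fields so provider defaults apply.
    return {k: v for k, v in asdict(config).items() if v is not None}

cfg = ChatModelConfig(max_tokens=256, temperature=0.2)
print(to_request_params(cfg))  # {'max_tokens': 256, 'temperature': 0.2}
```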