Overview
The BaseLlm abstract class provides the foundation for all LLM implementations in ADK-TS. It defines the core interface for generating content, managing streaming responses, and handling live connections.
Class Definition
Properties
- The name of the LLM model, e.g., gemini-2.5-flash or gpt-4.
- Protected logger instance for debugging and telemetry. Automatically initialized with the name "BaseLlm".
Constructor
The model identifier string used to instantiate the LLM.
Static Methods
supportedModels()
Returns a list of regex patterns for model names that this LLM class supports. Used by LlmRegistry for automatic model resolution.
Returns: string[] - Array of regex pattern strings
Default implementation: Returns empty array []
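As a sketch of how a registry might use these patterns, the snippet below defines a hypothetical subclass that advertises regex patterns and a matching helper. The class name and pattern strings are illustrative assumptions, not the actual ADK-TS values.

```typescript
// Hypothetical subclass advertising which model names it handles.
// In ADK-TS this static method would override BaseLlm.supportedModels().
class GeminiLlm {
  static supportedModels(): string[] {
    // Illustrative patterns only; the real ADK-TS patterns may differ.
    return ["gemini-1\\.5-.*", "gemini-2\\..*"];
  }
}

// A registry can test a model name against each pattern to resolve a class.
function resolves(model: string): boolean {
  return GeminiLlm.supportedModels().some((p) =>
    new RegExp(`^${p}$`).test(model)
  );
}

console.log(resolves("gemini-2.5-flash")); // true
console.log(resolves("gpt-4")); // false
```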
Methods
generateContentAsync()
Generates content from the LLM based on the provided request. Handles both streaming and non-streaming modes with automatic telemetry and error tracking.
Parameters:
- The request object containing contents, tools, and configuration.
- Whether to use streaming mode. When true, yields multiple responses as they arrive.
Returns: AsyncGenerator<LlmResponse, void, unknown>
For non-streaming calls, yields one complete response. For streaming calls, yields multiple partial responses that should be merged.
Telemetry:
- Tracks token usage (input, output, total)
- Measures time-to-first-token for streaming
- Records chunk count and timing
- Emits OpenTelemetry events for prompts and completions
- Captures finish reasons and error states
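To illustrate how a caller consumes this async generator and merges partial streaming responses, here is a minimal sketch. The fake generator and the single-field LlmResponse shape are assumptions standing in for the real ADK-TS types.

```typescript
// Minimal stand-in for ADK-TS's LlmResponse; the real type is richer.
interface LlmResponse {
  text: string;
  partial?: boolean;
}

// A fake generator mimicking a streaming generateContentAsync() call.
async function* fakeGenerateContentAsync(
  stream: boolean
): AsyncGenerator<LlmResponse, void, unknown> {
  if (stream) {
    yield { text: "Hello, ", partial: true };
    yield { text: "world!", partial: true };
  } else {
    yield { text: "Hello, world!" };
  }
}

// Streaming callers merge partial chunks into one final answer.
async function collect(stream: boolean): Promise<string> {
  let merged = "";
  for await (const response of fakeGenerateContentAsync(stream)) {
    merged += response.text;
  }
  return merged;
}

collect(true).then((s) => console.log(s)); // "Hello, world!"
```

Either mode produces the same merged text; streaming simply delivers it in chunks as they arrive.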
generateContentAsyncImpl()
Abstract method - Must be implemented by subclasses to provide the actual LLM API integration.
Parameters:
- The request object to process.
- Whether streaming is enabled.
Returns: AsyncGenerator<LlmResponse, void, unknown>
maybeAppendUserContent()
Protected method that ensures proper conversation structure by appending user content when necessary.
Parameters:
- The request to potentially modify.
Behavior:
- If no contents exist, adds a user message prompting the model to follow system instructions
- If the last message isn’t from the user, appends a continuation prompt
- Prevents empty model responses and maintains conversation flow
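The behavior above can be sketched as follows. The Content shape and the exact prompt wording are assumptions for illustration; the real ADK-TS types and strings may differ.

```typescript
// Assumed minimal shape of a conversation turn.
interface Content {
  role: "user" | "model";
  text: string;
}

function maybeAppendUserContent(contents: Content[]): Content[] {
  if (contents.length === 0) {
    // No history: prompt the model to act on its system instructions.
    contents.push({
      role: "user",
      text: "Handle the requests as specified in the system instructions.",
    });
  } else if (contents[contents.length - 1].role !== "user") {
    // Last turn was the model: append a continuation prompt so the
    // conversation always ends with a user message.
    contents.push({ role: "user", text: "Continue processing the request." });
  }
  // A request already ending with a user message passes through unchanged.
  return contents;
}

console.log(maybeAppendUserContent([]).length); // 1
```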
connect()
Creates a live bidirectional connection to the LLM for real-time interactions.
Parameters:
- The initial request configuration for the connection.
Returns: BaseLLMConnection
Default implementation: Throws an error indicating live connections are not supported.
Implementation Example
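A hedged sketch of what a subclass might look like. The stand-in BaseLlm below mirrors only the interface described on this page (the real class also handles telemetry), and EchoLlm is a toy provider that echoes the prompt instead of calling a real API.

```typescript
// Local stand-ins mirroring the interface described on this page.
interface LlmRequest {
  prompt: string;
}
interface LlmResponse {
  text: string;
}

abstract class BaseLlm {
  constructor(public model: string) {}

  static supportedModels(): string[] {
    return [];
  }

  // The real implementation wraps this in telemetry and error tracking;
  // here it simply delegates to the subclass hook.
  async *generateContentAsync(
    request: LlmRequest,
    stream = false
  ): AsyncGenerator<LlmResponse, void, unknown> {
    yield* this.generateContentAsyncImpl(request, stream);
  }

  protected abstract generateContentAsyncImpl(
    request: LlmRequest,
    stream: boolean
  ): AsyncGenerator<LlmResponse, void, unknown>;
}

// A toy provider: echoes the prompt rather than calling a real LLM API.
class EchoLlm extends BaseLlm {
  static supportedModels(): string[] {
    return ["echo-.*"]; // illustrative pattern
  }

  protected async *generateContentAsyncImpl(
    request: LlmRequest,
    stream: boolean
  ): AsyncGenerator<LlmResponse, void, unknown> {
    if (stream) {
      // Streaming: yield one chunk per word.
      for (const word of request.prompt.split(" ")) {
        yield { text: word + " " };
      }
    } else {
      // Non-streaming: yield one complete response.
      yield { text: request.prompt };
    }
  }
}
```

A real subclass would replace the echo logic with calls to its provider's SDK, mapping provider responses into LlmResponse objects as they arrive.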
Error Handling
The generateContentAsync method automatically:
- Catches and logs errors with telemetry
- Records error metrics for monitoring
- Rethrows exceptions for caller handling
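Because exceptions are rethrown after being logged, callers typically wrap consumption of the generator in try/catch. A sketch, using a fake generator that fails mid-stream (an assumption standing in for a real provider error):

```typescript
// A fake stream that fails partway through, as a network error might.
async function* failingGenerator(): AsyncGenerator<{ text: string }> {
  yield { text: "partial" };
  throw new Error("provider unavailable");
}

async function safeConsume(): Promise<string> {
  let output = "";
  try {
    for await (const chunk of failingGenerator()) {
      output += chunk.text;
    }
  } catch (err) {
    // Errors surface here after the base class has recorded telemetry;
    // the caller decides whether to retry, fall back, or report.
    output += ` [error: ${(err as Error).message}]`;
  }
  return output;
}
```

Note that any chunks yielded before the failure have already been delivered, so streaming callers should be prepared for partial output.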
Related Types
- LlmRequest - Request configuration object
- LlmResponse - Response data structure
- LlmRegistry - Registry for managing LLM implementations
Source Reference
See implementation: /packages/adk/src/models/base-llm.ts