DEFAULT_SYSTEM_PROMPT
Default system prompt used to guide the behavior of Large Language Models (LLMs). The prompt is designed to:
- Encourage concise, focused responses
- Promote helpful and tactful suggestions
- Foster productive collaboration
- Limit verbose outputs
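As a sketch, such a constant might look like the following. The exact wording below is illustrative only; the library ships its own prompt text:

```typescript
// Illustrative sketch — the library's actual DEFAULT_SYSTEM_PROMPT
// wording may differ. It encodes the goals listed above.
const DEFAULT_SYSTEM_PROMPT: string = [
  "You are a helpful assistant.",
  "Keep responses concise and focused on the user's request.",
  "Offer helpful, tactful suggestions and collaborate productively.",
  "Avoid unnecessarily verbose output.",
].join(" ");
```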
DEFAULT_STRUCTURED_OUTPUT_PROMPT
Generates a default structured output prompt based on the provided JSON schema.

Parameters
structuredOutputSchema (string) - A string representing the JSON schema for the desired output format.
Returns
A prompt string instructing the model to format its output according to the given schema.

Example
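A minimal sketch of how such a helper might be called. The function and parameter names come from this page; the exact returned wording is an assumption:

```typescript
// Sketch only — the real implementation ships with the library, and its
// exact prompt wording may differ from the assumption below.
function DEFAULT_STRUCTURED_OUTPUT_PROMPT(structuredOutputSchema: string): string {
  return (
    "Respond only with JSON that conforms to the following JSON schema:\n" +
    structuredOutputSchema
  );
}

// Hypothetical usage with a small schema string:
const schema = JSON.stringify({
  type: "object",
  properties: { answer: { type: "string" } },
  required: ["answer"],
});
const prompt = DEFAULT_STRUCTURED_OUTPUT_PROMPT(schema);
```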
DEFAULT_MESSAGE_HISTORY
Default message history for Large Language Models (LLMs).

DEFAULT_CONTEXT_BUFFER_TOKENS
Default context buffer tokens (the number of tokens kept in reserve for the model response) for Large Language Models (LLMs). This value represents:
- The number of tokens reserved for model generation
- Buffer space to prevent context overflow
- Default allocation when not explicitly configured
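The buffer's role can be sketched as a simple subtraction: whatever the model's context window holds, minus the reserved buffer, is available for the prompt and history. The helper function and the 512-token value below are illustrative (512 matches the sliding-window default mentioned under DEFAULT_CHAT_CONFIG):

```typescript
// Illustrative only — the value 512 is assumed here to match the
// sliding-window default described under DEFAULT_CHAT_CONFIG.
const DEFAULT_CONTEXT_BUFFER_TOKENS = 512;

// Hypothetical helper: tokens left for prompt + history after reserving
// buffer space for the model's response.
function promptTokenBudget(
  modelContextWindow: number,
  bufferTokens: number = DEFAULT_CONTEXT_BUFFER_TOKENS
): number {
  return Math.max(0, modelContextWindow - bufferTokens);
}
```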
DEFAULT_CHAT_CONFIG
Default chat configuration for Large Language Models (LLMs).

Properties
- systemPrompt - Uses DEFAULT_SYSTEM_PROMPT
- initialMessageHistory - Empty array from DEFAULT_MESSAGE_HISTORY
- contextStrategy - Sliding window strategy with a 512-token buffer
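The shape of this configuration, inferred from the properties listed above, might be sketched as follows. The `ChatConfig` and `ChatMessage` types and the placeholder values are assumptions for illustration, not the library's actual definitions:

```typescript
// Shape inferred from the Properties list; types and placeholder values
// below are assumptions, not the library's actual definitions.
interface ChatMessage { role: "system" | "user" | "assistant"; content: string; }

interface ChatConfig {
  systemPrompt: string;
  initialMessageHistory: ChatMessage[];
  contextStrategy: string; // stands in for the library's strategy object
}

const DEFAULT_CHAT_CONFIG: ChatConfig = {
  systemPrompt: "You are a helpful assistant.", // stands in for DEFAULT_SYSTEM_PROMPT
  initialMessageHistory: [],                    // DEFAULT_MESSAGE_HISTORY
  contextStrategy: "sliding-window",            // sliding window, 512-token buffer
};

// A caller can spread the defaults and override only what it needs:
const config: ChatConfig = {
  ...DEFAULT_CHAT_CONFIG,
  systemPrompt: "You are a terse coding assistant.",
};
```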
Usage
This configuration provides sensible defaults for chat applications.

Context Strategy
The default configuration uses a Sliding Window Context Strategy, which:
- Maintains recent conversation history within token limits
- Automatically truncates older messages when context is full
- Reserves buffer space for model responses
- Ensures system prompt is always included
To customize this behavior, implement the ContextStrategy interface with your own buildContext method.
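As a sketch, a custom strategy might look like the following. The `ContextStrategy` name and `buildContext` method come from the text above, but the method signature and the message type are assumptions; the real interface may differ:

```typescript
// Assumed message shape and interface signature — the library's actual
// ContextStrategy interface may differ.
interface Msg { role: string; content: string; tokens: number; }

interface ContextStrategy {
  buildContext(systemPrompt: Msg, history: Msg[], maxTokens: number): Msg[];
}

// Example strategy: keep the newest messages that fit the token budget,
// always including the system prompt — similar in spirit to the
// sliding-window default.
class RecentMessagesStrategy implements ContextStrategy {
  buildContext(systemPrompt: Msg, history: Msg[], maxTokens: number): Msg[] {
    let budget = maxTokens - systemPrompt.tokens;
    const kept: Msg[] = [];
    // Walk history from newest to oldest, keeping messages while they fit.
    for (let i = history.length - 1; i >= 0; i--) {
      if (history[i].tokens > budget) break;
      budget -= history[i].tokens;
      kept.unshift(history[i]);
    }
    return [systemPrompt, ...kept];
  }
}
```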