Request
No parameters required.

Example Request
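A minimal request sketch. The host and port (`http://localhost:8000`) and the endpoint path (`/models`) are assumptions for illustration; substitute your server's actual address:

```bash
curl http://localhost:8000/models
```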
Response
Returns a JSON object with available models.

`models`
Array of model identifiers in `provider/model-name` format. The specific models available depend on the configured harness and provider.

Examples:
- `openai/gpt-4o`
- `openai/gpt-4o-mini`
- `anthropic/claude-3-5-sonnet-20241022`
- `anthropic/claude-3-5-haiku-20241022`
The default model used when a model is not specified in chat requests. Only included if:
- A default model is configured on the server
- The default model exists in the `models` array
Example Response
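An illustrative response body. The `models` field name comes from the description above; the name of the default-model field (`default_model` here) is an assumption:

```json
{
  "models": [
    "openai/gpt-4o",
    "openai/gpt-4o-mini",
    "anthropic/claude-3-5-sonnet-20241022",
    "anthropic/claude-3-5-haiku-20241022"
  ],
  "default_model": "openai/gpt-4o"
}
```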
Example Response (No Default)
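When no default model is configured (or the configured default is not present in `models`), the default field is omitted entirely. An illustrative body, with the same assumed field name:

```json
{
  "models": [
    "openai/gpt-4o",
    "anthropic/claude-3-5-sonnet-20241022"
  ]
}
```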
Configuration
The default model is set via the `DEFAULT_MODEL` environment variable:
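For example, in the server's environment (the model identifier shown is illustrative):

```bash
export DEFAULT_MODEL="openai/gpt-4o"
```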
Notes
- The available models depend on the configured harness and provider
- The default harness uses the Zen provider, which routes to multiple LLM providers
- Custom harnesses can be configured to support different model sets
- Model identifiers follow the pattern `provider/model-name`
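Because every identifier follows `provider/model-name`, a client can recover the provider by splitting on the first `/` (splitting on the first slash only, since model names may themselves contain slashes):

```python
model_id = "anthropic/claude-3-5-sonnet-20241022"

# Split on the first "/" only: left part is the provider,
# the remainder is the model name.
provider, model_name = model_id.split("/", 1)

print(provider)    # anthropic
print(model_name)  # claude-3-5-sonnet-20241022
```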
