list
List all inference providers.

Parameters:
- Maximum number of items to return per page
- Pagination cursor for fetching the next page

Returns:
- List of inference provider objects
- Cursor for the next page, or None if no more results
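The cursor contract above (fetch a page, follow the returned cursor, stop at None) can be sketched as a small pagination loop. The `list_page` callable and the stub backend below are hypothetical stand-ins for the actual `list` call, since the real client interface is not shown here:

```python
from typing import Callable, List, Optional, Tuple

def fetch_all(
    list_page: Callable[[int, Optional[str]], Tuple[List[dict], Optional[str]]],
    page_size: int = 50,
) -> List[dict]:
    """Collect every provider by following the pagination cursor.

    `list_page` stands in for the `list` call: it takes (limit, cursor)
    and returns (items, next_cursor), where next_cursor is None once
    no more results remain.
    """
    providers: List[dict] = []
    cursor: Optional[str] = None
    while True:
        items, cursor = list_page(page_size, cursor)
        providers.extend(items)
        if cursor is None:
            return providers

# Demo against a stubbed in-memory backend (hypothetical data).
def make_stub(records: List[dict]):
    def list_page(limit: int, cursor: Optional[str]):
        start = int(cursor) if cursor is not None else 0
        page = records[start:start + limit]
        nxt = str(start + limit) if start + limit < len(records) else None
        return page, nxt
    return list_page

stub = make_stub([{"uid": str(i)} for i in range(5)])
print(len(fetch_all(stub, page_size=2)))  # → 5
```

Treating the cursor as an opaque token (rather than computing offsets client-side) keeps the loop correct even if the server changes how cursors are encoded.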
get
Retrieve a specific inference provider by UID.

Parameters:
- Unique identifier of the inference provider

Returns:
- Unique identifier for the inference provider
- Name of the inference provider
- Type of provider (e.g., "openai", "anthropic", "custom")
- Provider-specific configuration settings
- Whether the provider is currently active
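The fields returned by `get` can be modeled as a simple record. The class and field names below are assumptions for illustration; the actual response schema may use different keys:

```python
from dataclasses import dataclass, field
from typing import Any, Dict

@dataclass
class InferenceProvider:
    """Hypothetical shape of the object returned by `get`."""
    uid: str                # Unique identifier for the provider
    name: str               # Name of the inference provider
    provider_type: str      # e.g. "openai", "anthropic", "custom"
    config: Dict[str, Any] = field(default_factory=dict)  # provider-specific settings
    active: bool = True     # whether the provider is currently active

p = InferenceProvider(uid="prov_123", name="primary", provider_type="openai")
print(p.active)  # → True
```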
create
Create a new inference provider configuration.

Parameters:
- Name for this inference provider
- Type of provider (e.g., "openai", "anthropic", "azure", "custom")
- Provider-specific configuration object (API keys, endpoints, model parameters)
- Optional description of this provider configuration
- Whether this provider should be active (default: true)
- UID of the project to associate this provider with

Returns:
- Unique identifier for the inference provider
- Name of the inference provider
- Type of provider
- Provider configuration
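Since `create` mixes required and optional parameters, one reasonable pattern is to assemble the request body and omit optional fields that were never set. The field names (`name`, `type`, `config`, `description`, `active`, `project_uid`) are assumptions, not confirmed wire-format keys:

```python
from typing import Any, Dict, Optional

def build_create_payload(
    name: str,
    provider_type: str,
    config: Dict[str, Any],
    description: Optional[str] = None,
    active: bool = True,
    project_uid: Optional[str] = None,
) -> Dict[str, Any]:
    """Assemble a create request body, dropping optional fields left unset.

    All key names here are hypothetical; check the real API schema.
    """
    payload: Dict[str, Any] = {
        "name": name,
        "type": provider_type,   # e.g. "openai", "anthropic", "azure", "custom"
        "config": config,        # API keys, endpoints, model parameters
        "active": active,        # defaults to true, matching the docs above
    }
    if description is not None:
        payload["description"] = description
    if project_uid is not None:
        payload["project_uid"] = project_uid
    return payload

payload = build_create_payload(
    "azure-primary", "azure",
    {"endpoint": "https://example.invalid", "api_key": "..."},
)
print(sorted(payload))  # → ['active', 'config', 'name', 'type']
```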
update
Update an existing inference provider.

Parameters:
- Unique identifier of the inference provider to update
- Updated name for the provider
- Updated description
- Updated provider type
- Updated configuration object
- Updated active status
- Updated project association

Returns:
- Unique identifier for the inference provider
- Updated name
- Provider type
- Updated configuration
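Because every `update` parameter besides the UID is an optional "updated" field, a caller may want to send only what actually changed. A minimal diff helper, assuming the provider state is represented as a flat dict (a simplification, since the real object nests its configuration):

```python
from typing import Any, Dict

def changed_fields(current: Dict[str, Any], desired: Dict[str, Any]) -> Dict[str, Any]:
    """Return only the fields whose desired value differs from the
    current provider state, suitable as a partial update body."""
    return {k: v for k, v in desired.items() if current.get(k) != v}

current = {"name": "primary", "active": True, "description": "old"}
desired = {"name": "primary-v2", "active": True, "description": "old"}
print(changed_fields(current, desired))  # → {'name': 'primary-v2'}
```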
delete
Delete an inference provider.

Parameters:
- Unique identifier of the inference provider to delete
test
Test an inference provider to verify connectivity and configuration.

Parameters:
- Unique identifier of the inference provider to test

Returns:
- Whether the test was successful
- Test result message or error details
- Response latency in milliseconds
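The three return fields (success flag, message, latency in milliseconds) can be produced by timing a round trip and catching failures. This is a sketch of the server-side logic, not the actual implementation; `probe` is a hypothetical stand-in for the real provider call:

```python
import time
from typing import Callable, Tuple

def run_connectivity_test(probe: Callable[[], None]) -> Tuple[bool, str, float]:
    """Run `probe` and return (success, message, latency_ms),
    mirroring the fields the `test` call is documented to return."""
    start = time.perf_counter()
    try:
        probe()
    except Exception as exc:
        # On failure, the message carries the error details.
        return False, str(exc), (time.perf_counter() - start) * 1000.0
    return True, "ok", (time.perf_counter() - start) * 1000.0

ok, message, latency_ms = run_connectivity_test(lambda: None)
```

Measuring with `time.perf_counter` (a monotonic clock) rather than `time.time` avoids negative latencies if the wall clock is adjusted mid-test.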