# cyberstrike provider

`cyberstrike provider` manages custom local or self-hosted model providers. Any server that exposes an OpenAI-compatible `/v1` API — such as Ollama, LM Studio, vLLM, or a private proxy — can be registered and used as a model source.
## Subcommands

| Subcommand | Aliases | Description |
|---|---|---|
| `cyberstrike provider add` | | Add a new custom provider |
| `cyberstrike provider list` | `ls` | List configured custom providers |
| `cyberstrike provider remove <id>` | `rm` | Remove a custom provider |
## cyberstrike provider add

Registers a new OpenAI-compatible provider. The command queries the provider's `/v1/models` endpoint to discover available models and writes the result to your project or global configuration.

When `--name` and `--url` are both provided, the command runs non-interactively. Otherwise it prompts for each value.
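You can verify that a server speaks the expected API by querying the same endpoint yourself. The URL below is illustrative; substitute your own server's address:

```shell
# Query the /v1/models endpoint the add command uses for discovery.
# An OpenAI-compatible server returns a JSON list of model IDs, e.g.:
#   {"object":"list","data":[{"id":"llama3","object":"model"}]}
curl -s http://192.168.1.201:8000/v1/models
```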
### Flags

- `--name`: Display name for the provider (e.g. `Local Llama`). Used as the human-readable label in the TUI and CLI output.
- `--url`: Base URL of the OpenAI-compatible endpoint, including `/v1` (e.g. `http://192.168.1.201:8000/v1`).
- API key: Leave empty for providers that do not require authentication.
- Scope: Where to write the provider configuration. Choices: `project`, `global`.
  - `project`: writes to `cyberstrike.json` in the current project.
  - `global`: writes to the global CyberStrike configuration directory.
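For reference, a registered provider entry written to `cyberstrike.json` might look roughly like the sketch below. The exact field names are an assumption for illustration, not confirmed by this document:

```json
{
  "providers": {
    "local-llama": {
      "name": "Local Llama",
      "baseUrl": "http://192.168.1.201:8000/v1",
      "models": ["llama3"]
    }
  }
}
```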
### Examples
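A non-interactive registration using the documented `--name` and `--url` flags might look like this; the name and URL are illustrative:

```shell
# Register a self-hosted server non-interactively
cyberstrike provider add --name "Local Llama" --url http://192.168.1.201:8000/v1

# Or run without flags to be prompted for each value
cyberstrike provider add
```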
### Output

After registration, the command prints the provider ID and how to reference models from it.

## cyberstrike provider list
Lists every custom provider configured in the current project. For each provider it shows the ID, display name, API base URL, and the individual model IDs.

### Example output
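The listing might look something like the following. The layout, IDs, and model names here are hypothetical, shown only to illustrate the fields described above:

```
local-llama    Local Llama    http://192.168.1.201:8000/v1
  models: llama3
```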
## cyberstrike provider remove

Removes a custom provider from the project configuration by its ID. The provider ID is shown in the output of `cyberstrike provider list`.
### Example
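Assuming a provider was registered under the ID `local-llama` (an illustrative ID), removal looks like:

```shell
cyberstrike provider remove local-llama
```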
## Using a custom provider model

Once added, reference models from a custom provider using the `provider-id/model-id` format wherever a `--model` flag is accepted.
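For example, with the illustrative provider ID `local-llama` and model `llama3`, any subcommand that accepts `--model` (shown here as a placeholder `<command>`) would take:

```shell
cyberstrike <command> --model local-llama/llama3
```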