
Overview

The LLM API provides endpoints for managing Large Language Model (LLM) integration. LLM features power ontology generation from natural language and entity extraction from unstructured data.

List Available Models

Get available LLM models from the configured provider.
curl -X GET http://localhost:8080/api/llm/models
Response (LLM enabled):
{
  "provider": "openrouter",
  "enabled": true,
  "models": [
    {
      "id": "anthropic/claude-3-opus",
      "name": "Claude 3 Opus",
      "context_length": 200000
    },
    {
      "id": "openai/gpt-4",
      "name": "GPT-4",
      "context_length": 8192
    }
  ]
}
Response (LLM not configured):
{
  "provider": "none",
  "enabled": false,
  "models": []
}
provider (string) — Provider name: openrouter, openai_compat, or none
enabled (boolean) — Whether LLM features are available
models (array) — Available model list (cached with a TTL)
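Client code should check the `enabled` flag (or a `provider` of `none`) before offering LLM features. The following is a minimal Python sketch; the base URL matches the curl examples above, and the helper names are illustrative, not part of the API.

```python
import json
import urllib.request

BASE_URL = "http://localhost:8080"  # assumed local orchestrator, as in the curl examples


def fetch_models(base_url: str = BASE_URL) -> dict:
    """GET /api/llm/models and return the parsed JSON body."""
    with urllib.request.urlopen(f"{base_url}/api/llm/models") as resp:
        return json.load(resp)


def usable_models(payload: dict) -> list[str]:
    """Return model IDs, or an empty list when LLM support is disabled."""
    if not payload.get("enabled") or payload.get("provider") == "none":
        return []
    return [m["id"] for m in payload.get("models", [])]
```

With the two documented response shapes, `usable_models` yields the model IDs when LLM is enabled and an empty list when it is not configured.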

Configuration

LLM integration is configured via environment variables:
| Variable | Description | Example |
| --- | --- | --- |
| LLM_ENABLED | Enable LLM features | true |
| LLM_PROVIDER | Provider type | openrouter or openai_compat |
| LLM_API_KEY | API key | sk-... |
| LLM_BASE_URL | Base URL (openai_compat only) | https://api.together.xyz/v1 |
| LLM_MODEL | Default model | anthropic/claude-3-opus |
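The constraints in the table can be checked at startup. Below is a minimal validation sketch in Python; the function name and the exact rule set (which variables are required when LLM is enabled) are assumptions based on the table above, not a documented API.

```python
# Required when LLM_ENABLED=true (assumption drawn from the table above).
REQUIRED = ("LLM_API_KEY", "LLM_MODEL", "LLM_PROVIDER")


def validate_llm_env(env: dict[str, str]) -> list[str]:
    """Return a list of configuration problems; empty means the config looks valid."""
    problems: list[str] = []
    if env.get("LLM_ENABLED", "").lower() != "true":
        return problems  # LLM disabled: nothing else to check

    for key in REQUIRED:
        if not env.get(key):
            problems.append(f"{key} is required when LLM_ENABLED=true")

    provider = env.get("LLM_PROVIDER")
    if provider and provider not in ("openrouter", "openai_compat"):
        problems.append("LLM_PROVIDER must be openrouter or openai_compat")
    if provider == "openai_compat" and not env.get("LLM_BASE_URL"):
        problems.append("LLM_BASE_URL is required for openai_compat")
    return problems
```

Passing `dict(os.environ)` would validate the live environment; passing a plain dict makes the rules easy to test.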

OpenRouter Provider

LLM_ENABLED=true
LLM_PROVIDER=openrouter
LLM_API_KEY=sk-or-v1-...
LLM_MODEL=anthropic/claude-3-opus

OpenAI-Compatible Provider

LLM_ENABLED=true
LLM_PROVIDER=openai_compat
LLM_API_KEY=your-api-key
LLM_BASE_URL=https://api.together.xyz/v1
LLM_MODEL=meta-llama/Llama-3-70b-chat-hf

External LLM Providers

Dynamically install custom LLM provider plugins from Go source repositories.

List External Providers

Get all installed external LLM providers.
curl -X GET http://localhost:8080/api/llm/providers
Response:
[
  {
    "name": "ollama",
    "description": "Local Ollama instance",
    "repository": "https://github.com/example/mimir-llm-ollama",
    "version": "v1.0.0",
    "installed_at": "2026-03-01T10:00:00Z",
    "loaded": true
  }
]
name (string) — Provider identifier
loaded (boolean) — Whether the provider's compiled plugin (.so) is currently loaded
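Because a provider can be installed but not yet loaded, clients may want to surface the ones that still need a restart. A minimal Python sketch, assuming the local orchestrator URL from the curl examples; the helper names are illustrative.

```python
import json
import urllib.request


def list_external_providers(base_url: str = "http://localhost:8080") -> list[dict]:
    """GET /api/llm/providers and return the parsed JSON array."""
    with urllib.request.urlopen(f"{base_url}/api/llm/providers") as resp:
        return json.load(resp)


def unloaded_providers(providers: list[dict]) -> list[str]:
    """Names of installed providers whose .so is not currently loaded."""
    return [p["name"] for p in providers if not p.get("loaded", False)]
```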

Install External Provider

Install an external LLM provider from a Git repository.
curl -X POST http://localhost:8080/api/llm/providers \
  -H "Content-Type: application/json" \
  -d '{
    "name": "ollama",
    "repository": "https://github.com/example/mimir-llm-ollama",
    "version": "v1.0.0",
    "description": "Local Ollama instance"
  }'
name (string, required) — Unique provider identifier
repository (string, required) — Git repository URL. The repository must export var Provider satisfying the LLM provider interface.
version (string, required) — Git tag, branch, or commit SHA
description (string, optional) — Human-readable description
Response: 201 Created on success, or 422 Unprocessable Entity if the plugin fails to compile
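A client can validate the required fields before sending the request and treat a 422 as a compilation failure. The following Python sketch mirrors the curl example above; the function name and client-side validation are illustrative, not part of the API.

```python
import json
import urllib.error
import urllib.request


def install_provider(base_url: str, spec: dict) -> int:
    """POST an external-provider spec to /api/llm/providers; return the HTTP status."""
    missing = [k for k in ("name", "repository", "version") if not spec.get(k)]
    if missing:
        raise ValueError(f"missing required fields: {missing}")

    req = urllib.request.Request(
        f"{base_url}/api/llm/providers",
        data=json.dumps(spec).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status  # 201 Created on success
    except urllib.error.HTTPError as err:
        return err.code  # 422 signals a compilation failure
```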

Get External Provider

Get details for a specific external provider.
curl -X GET http://localhost:8080/api/llm/providers/ollama

Uninstall External Provider

Remove an external LLM provider.
curl -X DELETE http://localhost:8080/api/llm/providers/ollama
Response: 204 No Content
Go plugins cannot be unloaded from memory. An orchestrator restart is required for full removal.

LLM Features

LLM integration powers:
  • Ontology Generation: Generate OWL ontologies from natural language descriptions
  • Entity Extraction: Extract structured entities from unstructured text, documents, and media
  • Schema Inference: Infer entity types and relationships from raw data
