What is MCP?
The Model Context Protocol (MCP) is an open standard that allows AI agents to interact with external systems through a standardized tool interface. Instead of requiring a custom integration for every system, MCP gives LLMs a universal way to invoke functions, query data, and orchestrate workflows.

Why Use MCP with Mimir?
Mimir AIP is designed as an agent-first platform. The MCP integration enables:

- Natural language control: Configure pipelines, train models, and query digital twins using conversational AI
- Workflow automation: Chain complex operations across multiple Mimir resources
- Reduced integration overhead: Use any MCP-compatible client (Claude Desktop, Claude Code, custom agents) without writing API wrappers
- Observability: Track long-running tasks (ML training, pipeline execution) via work task APIs
- Ontology-driven intelligence: Generate ontologies from text or storage, then use them to structure data and ML models
Architecture
Mimir’s MCP server is embedded in the orchestrator and exposes 55+ tools over a Server-Sent Events (SSE) transport.

Tool Categories
Mimir exposes 55 MCP tools organized into 8 categories:

| Category | Tool Count | Description |
|---|---|---|
| Projects | 8 | Create, list, update, delete, clone projects; manage component associations (pipelines, ontologies, models, twins, storage) |
| Pipelines | 6 | Define data ingestion/processing/output pipelines; execute them asynchronously |
| Schedules | 5 | Create cron-based triggers to run pipelines on a recurring schedule |
| ML Models | 7 | Define, train, and run inference on decision trees, random forests, regression, or neural networks; get model recommendations |
| Digital Twins | 7 | Create in-memory entity graphs; sync from storage; query with SPARQL |
| Ontologies | 6 | Create/update OWL ontologies; generate from text or extract from storage; diff ontologies |
| Storage | 10 | Configure storage backends (filesystem, PostgreSQL, MySQL, MongoDB, S3, Redis, Elasticsearch, Neo4j); store/retrieve/update/delete CIR records; health checks |
| Tasks | 3 | List queued/running work tasks; poll task status; wait for completion |
| System | 1 | Platform health check |
Common Use Cases
1. Conversational Project Setup
Ask your agent:

“Create a new Mimir project called ‘SmartFactory’, then add a PostgreSQL storage backend with connection string ‘postgres://…’, and generate an ontology by extracting entities from that database.”

The agent orchestrates:
1. `create_project`
2. `create_storage_config`
3. `extract_and_generate_ontology`
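As a sketch, the agent’s three calls might carry arguments like the following. The parameter names here are illustrative assumptions, not the platform’s actual schemas; consult the Tools Reference for the real ones:

```python
# Illustrative tool-call sequence for the conversational setup above.
# Parameter names are assumptions; see the Tools Reference for exact schemas.
calls = [
    ("create_project", {"name": "SmartFactory"}),
    ("create_storage_config", {
        "project": "SmartFactory",
        "backend": "postgresql",
        "connection_string": "postgres://…",  # elided in the prompt above
    }),
    ("extract_and_generate_ontology", {
        "project": "SmartFactory",
        "source_storage": "postgresql",
    }),
]

for name, args in calls:
    print(name, sorted(args))
```

The ordering matters: the ontology extraction depends on the storage config, which depends on the project, so the agent must issue the calls sequentially rather than in parallel.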
2. Automated ML Training Pipeline
“Train a random forest model on the ‘sensor-data’ storage for the ‘SmartFactory’ project, then run inference on new data and sync the results to the digital twin.”

The agent:
1. `create_ml_model` (type: `random_forest`)
2. `train_ml_model` → returns task ID
3. `wait_for_task` → polls until training completes
4. `run_inference` → enqueues inference job
5. `sync_digital_twin` → updates entity graph with predictions
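Step 3 is the usual poll-until-terminal pattern for async task APIs. A minimal sketch of that loop, with a stubbed status source standing in for the real `get_work_task` call (the status strings and return shape are assumptions):

```python
import time

def wait_for_task(task_id, get_status, timeout=300.0, interval=1.0):
    """Poll a task until it reaches a terminal state, mirroring the
    wait_for_task MCP tool. get_status stands in for the get_work_task
    call; the status strings used here are assumptions."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = get_status(task_id)
        if status in ("completed", "failed"):
            return status
        time.sleep(interval)
    raise TimeoutError(f"task {task_id} did not finish within {timeout}s")

# Demo with a stubbed status source: queued -> running -> completed.
states = iter(["queued", "running", "completed"])
result = wait_for_task("task-42", lambda _tid: next(states), interval=0.01)
print(result)  # → completed
```

An agent using the built-in `wait_for_task` tool gets this behavior server-side; the sketch is only for clients that poll `get_work_task` themselves.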
3. Scheduled Data Ingestion
“Set up an hourly pipeline that ingests IoT telemetry from the ‘devices’ storage, transforms it with the ‘normalize’ plugin, and outputs to the ‘warehouse’ database.”

The agent:
1. `create_pipeline` (type: `ingestion`, steps: `[{...}]`)
2. `create_schedule` (cron: `"0 * * * *"`, pipeline_ids: `[...]`)
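Spelled out as argument payloads, the two calls might look like this. The step plugin names and field names are assumptions for illustration; the Tools Reference documents the actual schemas:

```python
# Illustrative arguments for the two tool calls above. Field names and
# plugin names are assumptions; consult the Tools Reference for schemas.
pipeline_args = {
    "name": "iot-telemetry-hourly",  # hypothetical pipeline name
    "type": "ingestion",
    "steps": [
        {"plugin": "input", "storage": "devices"},
        {"plugin": "normalize"},
        {"plugin": "output", "storage": "warehouse"},
    ],
}

schedule_args = {
    "cron": "0 * * * *",  # minute 0 of every hour, i.e. hourly on the hour
    "pipeline_ids": ["<id returned by create_pipeline>"],
}

print(len(pipeline_args["steps"]), schedule_args["cron"])
```

Note that `create_schedule` needs the pipeline ID returned by `create_pipeline`, so the agent must wait for the first call’s response before issuing the second.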
4. Digital Twin Querying
“Show me all sensors in building A that have temperature readings above 80°C in the last hour.”

The agent:
Calls `query_digital_twin` with a generated SPARQL query.
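The generated query might look like the following sketch. The `mimir:` vocabulary and every class/property name in it are assumptions, not the platform’s actual ontology:

```sparql
PREFIX mimir: <http://example.org/mimir#>
PREFIX xsd:   <http://www.w3.org/2001/XMLSchema#>

SELECT ?sensor ?temp ?time
WHERE {
  ?sensor a mimir:Sensor ;
          mimir:locatedIn mimir:BuildingA ;
          mimir:hasReading ?reading .
  ?reading mimir:temperature ?temp ;
           mimir:timestamp ?time .
  FILTER (?temp > 80 && ?time > NOW() - "PT1H"^^xsd:dayTimeDuration)
}
```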
Transport and Protocol
Mimir’s MCP server uses Server-Sent Events (SSE) as the transport layer:

- Endpoint: `http://localhost:8080/mcp/sse`
- Connection: Long-lived HTTP connection with event streaming
- Authentication: Currently unauthenticated (suitable for local/trusted networks); add reverse proxy auth for production
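On the wire, MCP messages are JSON-RPC 2.0. A tool invocation sent over this transport looks roughly like the following (the request `id` values are arbitrary, and the query argument is a throwaway example):

```python
import json

# JSON-RPC 2.0 request invoking an MCP tool, per the MCP specification.
call_request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_digital_twin",
        "arguments": {"query": "SELECT ?s WHERE { ?s a ?type }"},
    },
}

# Discovering the available tools uses the tools/list method.
list_request = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

print(json.dumps(call_request, indent=2))
```

MCP clients such as Claude Desktop and Claude Code construct these messages for you; the shapes are shown only to clarify what travels over the SSE connection.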
Next Steps
- Setup Guide: Configure Claude Code or other MCP clients to connect to Mimir
- Tools Reference: Complete documentation of all 55 MCP tools with parameters and examples
Example Workflow

The Common Use Cases above walk through complete end-to-end examples of using Mimir via MCP.

Security Considerations
- Local development: The default SSE endpoint has no authentication and is suitable for localhost development
- Production deployment: Place Mimir behind a reverse proxy (nginx, Traefik) with:
  - TLS/SSL termination
  - Bearer token or API key authentication
  - Rate limiting
- Network isolation: Run Mimir in a private Kubernetes network; expose MCP endpoint only to authorized clients
Limitations
- Stateless: Each MCP tool call is independent; the server does not maintain conversation context
- Long-running operations: Training and pipeline execution return task IDs; use `wait_for_task` or poll `get_work_task` for completion
- Bulk operations: Some tools (e.g., `store_data`) accept arrays but have practical limits; for large datasets, use batch pipelines instead
- SPARQL complexity: The `query_digital_twin` tool supports standard SPARQL, but queries are executed in-memory; very large graphs may require optimization
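If batch pipelines are not an option, a client can at least keep individual `store_data` payloads bounded by chunking records before sending them. A minimal sketch (the batch size of 1,000 is an arbitrary assumption, not a documented limit):

```python
from itertools import islice
from typing import Iterable, Iterator, List

def chunked(records: Iterable[dict], size: int) -> Iterator[List[dict]]:
    """Yield successive fixed-size batches from an iterable of records."""
    it = iter(records)
    while batch := list(islice(it, size)):
        yield batch

# Example: split 2,500 records into batches of 1,000 before each
# hypothetical store_data call.
records = [{"id": i} for i in range(2500)]
batches = list(chunked(records, 1000))
print([len(b) for b in batches])  # → [1000, 1000, 500]
```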