What is MCP?
The Model Context Protocol (MCP) is an open standard for connecting AI agents to external data sources and tools. Fenic's MCP integration allows you to:
- Expose Fenic tables as queryable tools with automatic schema generation
- Create custom tools from DataFrame operations
- Serve context via HTTP or stdio for local and remote access
- Auto-generate read-only system tools (Schema, Profile, Read, Search Summary, Search Content, Analyze) for any table
Quick Start
1. Install MCP Support
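Assuming Fenic is published on PyPI and ships MCP support as an optional extra (the extra name here is a guess, not confirmed by this page), installation would look like:

```shell
pip install "fenic[mcp]"
```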
2. Create a Basic MCP Server
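A minimal server might look like the following pseudocode sketch. The helper names `create_mcp_server` and `run_mcp_server` are assumptions, not verified Fenic API; `table_names`, the HTTP transport, and the default address come from this page:

```python
# Pseudocode sketch -- helper names are hypothetical, not verified Fenic API.
import fenic as fc

session = fc.Session.get_or_create(fc.SessionConfig(app_name="mcp_demo"))

# Expose existing tables; Fenic auto-generates the system tools for each one.
server = create_mcp_server(
    session,
    server_name="fenic-demo",
    table_names=["users"],  # tables to expose as queryable tools
)

# Serve over HTTP on the address referenced in the quick start.
run_mcp_server(server, transport="http", host="127.0.0.1", port=8000)
```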
3. Connect from an Agent
Configure your MCP client (Claude Desktop, Cline, etc.) to connect to http://127.0.0.1:8000.
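For HTTP-capable clients, the configuration commonly takes a shape like the following; the exact keys vary by client, so treat this as a sketch rather than a verified config:

```json
{
  "mcpServers": {
    "fenic": {
      "url": "http://127.0.0.1:8000"
    }
  }
}
```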
Architecture
Auto-Generated System Tools
When you specify `table_names`, Fenic automatically generates six tools for each table:
| Tool | Description | Use Case |
|---|---|---|
| Schema | List columns and types | First step in exploring data |
| Profile | Statistics per column (min/max/mean/distinct/top values) | Understand data distribution |
| Read | Read rows with filtering and pagination | Sample data or simple queries |
| Search Summary | Regex search across all tables, return match counts | Find which tables contain relevant data |
| Search Content | Return matching rows from a specific table | Get actual rows matching a pattern |
| Analyze | Execute SQL (SELECT-only) across tables | Complex queries, aggregations, JOINs |
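The SELECT-only restriction on the Analyze tool can be illustrated with a small stdlib sketch; this is a hypothetical guard in the spirit of the tool, not Fenic's implementation:

```python
import re
import sqlite3

def analyze(conn: sqlite3.Connection, query: str):
    """Execute SQL, rejecting anything that is not a SELECT statement.

    Illustrative stdlib sketch of a SELECT-only guard, not Fenic's code.
    """
    if re.match(r"\s*select\b", query, re.IGNORECASE) is None:
        raise ValueError("Analyze accepts SELECT statements only")
    return conn.execute(query).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ada"), (2, "grace")])

print(analyze(conn, "SELECT name FROM users ORDER BY id"))  # [('ada',), ('grace',)]
```

A write statement such as `DROP TABLE users` is rejected before it reaches the database.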
Example: Using Auto-Generated Tools
Custom User-Defined Tools
Create custom tools from DataFrame queries using `session.catalog.create_tool`:
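Only the `session.catalog.create_tool` entry point is named on this page; the argument names below are assumptions, so read this as pseudocode rather than the verified signature:

```python
# Pseudocode sketch -- argument names are assumptions, not verified Fenic API.
df = session.table("users")

session.catalog.create_tool(
    "active_users",                        # tool name exposed to agents
    "Return users with an active status",  # description agents use to pick the tool
    df.filter(df["status"] == "active"),   # DataFrame query backing the tool
)
```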
Tool with Multiple Parameters
Real-World Example: Documentation Server
From `examples/mcp_server/docs-server/server.py`:
Transport Options
HTTP Transport (Recommended for Development)
Stdio Transport (CLI Integration)
Semantic Operations in Tools
Combine Fenic’s semantic capabilities with MCP tools:

Advanced Configuration
Concurrency Control
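The page names a `concurrency_limit` setting; the underlying idea can be sketched with a stdlib semaphore (a hypothetical illustration, not Fenic's implementation):

```python
import asyncio

async def guarded_tool_call(semaphore: asyncio.Semaphore, tool, arg):
    # A concurrency limit bounds how many tool calls run at once.
    async with semaphore:
        return await tool(arg)

async def main():
    active = 0
    peak = 0
    sem = asyncio.Semaphore(2)  # analogous to concurrency_limit = 2

    async def tool(i):
        nonlocal active, peak
        active += 1
        peak = max(peak, active)
        await asyncio.sleep(0.01)  # stand-in for real query work
        active -= 1
        return i

    results = await asyncio.gather(
        *(guarded_tool_call(sem, tool, i) for i in range(6))
    )
    print(results, peak)  # [0, 1, 2, 3, 4, 5] 2

asyncio.run(main())
```

Even though six calls are issued at once, at most two are ever in flight.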
Result Limits
Control maximum results returned by tools:

Table Descriptions
MCP tools use table descriptions to help agents understand data:

Error Handling
Production Deployment
ASGI/Uvicorn
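Assuming the server exposes an ASGI application (the module path `server:app` below is a placeholder), it can be served with uvicorn:

```shell
uvicorn server:app --host 127.0.0.1 --port 8000
```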
Docker
MCP Client Configuration Examples
- Claude Desktop
- Cline/VSCode
- Stdio (CLI)
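For stdio clients, the server is launched as a subprocess; a hedged example of the common config shape (command and path are placeholders, and exact keys vary by client):

```json
{
  "mcpServers": {
    "fenic": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```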
Best Practices
Performance
- Use `result_limit` to cap tool results and reduce token usage
- Index frequently queried columns in your tables
- Consider caching for expensive semantic operations
- Use `concurrency_limit` to prevent resource exhaustion
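The effect of a result cap can be sketched with a plain helper (hypothetical and stdlib-only; Fenic's own `result_limit` handling may differ):

```python
def apply_result_limit(rows, result_limit=100):
    """Cap the rows a tool returns so responses stay within token budgets.

    Returns the capped rows plus a flag telling the agent truncation occurred.
    Illustrative helper, not Fenic's implementation.
    """
    return rows[:result_limit], len(rows) > result_limit

rows = list(range(250))
page, truncated = apply_result_limit(rows)
print(len(page), truncated)  # 100 True
```

Surfacing the truncation flag lets an agent know it should narrow its query rather than assume it saw everything.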
Security
- Run servers on localhost or behind authentication
- Validate all tool inputs (Fenic validates automatically)
- Use read-only tools when possible
- Set an appropriate `max_result_limit` to prevent data leaks
Agent Design
- Provide clear tool descriptions to guide agents
- Use table descriptions to explain data semantics
- Start agents with Schema/Profile tools before querying
- Design tools for specific use cases rather than generic access
Troubleshooting
Server Won’t Start
Tools Not Appearing
Table Not Found
Next Steps
Agent Frameworks
Integrate Fenic with LangGraph, PydanticAI, and more
LLM Providers
Configure semantic operations with OpenAI, Anthropic, etc.
