Overview
MCPHost is a command-line MCP client that connects local AI models (via Ollama) to MCP servers. This lets you interact with Oracle Cloud Infrastructure using locally running language models, without requiring cloud-based AI services.

Prerequisites
Before configuring MCPHost, ensure you have completed:
- Quick Start setup
- OCI Authentication (for OCI servers)
- Installed `uv` and Python 3.13
Installation
Download Ollama
Download and install Ollama for your platform:
- macOS: Download the installer or use `brew install ollama`
- Windows: Download the official installer
- Linux: Follow the installation instructions on the Ollama website
Start Ollama Server
Start the Ollama service. On Windows, if installed via the official installer, the server typically starts automatically in the background and on system boot. On macOS and Linux, start it manually or as a system service.
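The exact start command depends on your platform; a typical sketch, assuming a Homebrew install on macOS and the official install script's systemd unit on Linux:

```shell
# macOS (Homebrew): run Ollama as a background service
brew services start ollama

# Linux (official install script): start the systemd service
sudo systemctl start ollama

# Any platform: run the server directly in the foreground
ollama serve
```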
Verify Ollama
Verify the Ollama server is running:
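For example, query the server's root endpoint on Ollama's default port:

```shell
# Ollama listens on port 11434 by default
curl http://localhost:11434
```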
A successful response will typically be `Ollama is running`.

Pull Language Model
Download a language model that supports tool calling with `ollama pull <model>`, replacing `<model>` with your desired model (e.g., `llama3.2`, `mistral`, `qwen2.5`). For more options, check Ollama's list of models that support tool calling.

Install Go
Install Go from go.dev if you don’t already have it installed.
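Go is needed to build the MCPHost binary itself. Assuming the upstream `mark3labs/mcphost` module path, the install typically looks like:

```shell
# Install the MCPHost CLI into the Go bin directory (usually ~/go/bin)
go install github.com/mark3labs/mcphost@latest
```

Make sure `~/go/bin` is on your PATH so the `mcphost` command resolves.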
Configuration
Create Configuration File
Create an MCPHost configuration file (e.g., `~/.mcphost.json`). For more details, see the MCPHost MCP Servers documentation.
Standard Configuration (stdio)
macOS/Linux
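A minimal sketch of a stdio server entry, assuming MCPHost's `mcpServers` configuration format and a `uv`-launched server; the server name and package argument below are placeholders, not the actual Oracle package names:

```json
{
  "mcpServers": {
    "oci-server": {
      "command": "uvx",
      "args": ["<oracle-mcp-server-package>"],
      "env": {
        "OCI_CONFIG_PROFILE": "<profile_name>"
      }
    }
  }
}
```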
Replace `<profile_name>` with the OCI CLI profile you set up during authentication.
Windows
Windows configuration follows the same pattern as macOS/Linux. The main difference is path formatting and environment variable syntax. Refer to the Installation guide for Windows-specific examples.
Podman Configuration
To run Oracle MCP Servers in containers using podman, use a server entry whose `command` invokes `podman run`.

Starting MCPHost
Start MCPHost with your configuration, substituting:
- `<profile>`: the OCI CLI profile name you set up during authentication
- `<model>`: the Ollama model you pulled (e.g., `llama3.2`)
- `<config-path>`: the path to your MCPHost configuration file (e.g., `~/.mcphost.json`)
Example
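A sketch of the invocation, assuming the upstream MCPHost CLI's `-m/--model` and `--config` flags and its `ollama:` model prefix (verify the flag names against your installed version):

```shell
# Select the OCI profile for the MCP server, then launch MCPHost
export OCI_CONFIG_PROFILE=<profile>
mcphost -m ollama:<model> --config <config-path>
```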
Usage Examples
Once MCPHost is running, you can interact with Oracle Cloud Infrastructure through natural language.

Multiple Servers
You can configure multiple Oracle MCP Servers in your MCPHost configuration.

Troubleshooting
Ollama Not Running
Error: `connection refused` when trying to connect to Ollama
Solution:
- Verify Ollama is running: `curl http://localhost:11434`
- Check Ollama service status:
  - macOS (Homebrew): `brew services list`
  - Linux: `sudo systemctl status ollama`
- Restart Ollama if needed
Model Not Found
Error: Model not available
Solution:
- Check installed models: `ollama list`
- Pull the model: `ollama pull <model>`
- Verify the model name matches in your command
MCPHost Command Not Found
Error: `mcphost: command not found`
Solution:
- Verify Go bin is in PATH: `echo $PATH | grep go/bin`
- Add to PATH: `export PATH=$PATH:~/go/bin`
- Verify installation: `ls ~/go/bin/mcphost`
MCP Server Connection Errors
Error: Failed to connect to MCP server
Solution:
- Verify `uv` is installed and in PATH
- Check the OCI profile exists: `cat ~/.oci/config`
- Ensure OCI authentication is valid
- Set `FASTMCP_LOG_LEVEL` to `DEBUG` for detailed logs
- For podman: ensure the container image is built
Authentication Failures
Error: OCI authentication errors
Solution:
- Verify the profile name matches: `cat ~/.oci/config`
- Check whether the session token has expired
- Refresh the token: `oci session authenticate --profile-name <profile>`
- Ensure the `OCI_CONFIG_PROFILE` environment variable is set correctly
Performance Considerations
Model Selection
Choose models based on your hardware:
- Small models (7B parameters): faster, lower memory, suitable for most tasks
- Medium models (13B parameters): Better accuracy, requires more RAM
- Large models (70B+ parameters): Best accuracy, requires significant resources
Local vs Cloud AI
Benefits of MCPHost + Ollama:
- Complete data privacy (everything runs locally)
- No API costs for AI inference
- Works offline
- Faster for simple queries
Tradeoffs:
- Requires local compute resources
- May be slower for complex reasoning
- Model quality varies
Next Steps
Explore Available Servers
Discover all Oracle MCP Servers you can connect to
Ollama Models
Browse available Ollama models
