Overview

MCPHost is a command-line MCP client that connects local AI models (via Ollama) to MCP servers. This enables you to interact with Oracle Cloud Infrastructure using locally-running language models without requiring cloud-based AI services.

Prerequisites

Before configuring MCPHost, ensure you have completed OCI CLI setup and authentication, so that a working profile exists in ~/.oci/config.

Installation

1. Download Ollama

Download and install Ollama for your platform:
  • macOS: Download the installer or use brew install ollama
  • Windows: Download the official installer
  • Linux: Follow the installation instructions on the Ollama website
2. Start Ollama Server

Start the Ollama service.
macOS:
# If installed via the official installer
ollama start

# If installed via Homebrew
brew services start ollama
Windows: If installed via the official installer, the server typically starts automatically in the background and on system boot.
Linux:
sudo systemctl start ollama
3. Verify Ollama

Verify the Ollama server is running:
curl http://localhost:11434
A successful response will typically be Ollama is running.
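The check above can be scripted. This sketch (assuming curl is installed) probes Ollama's lightweight /api/version REST endpoint and reports whether the server is reachable on its default port:

```shell
# Probe the Ollama server on its default port (11434).
# /api/version is a lightweight endpoint of Ollama's REST API.
if command -v curl >/dev/null 2>&1 \
   && curl -fsS http://localhost:11434/api/version >/dev/null 2>&1; then
  STATUS=up
else
  STATUS=down
fi
echo "ollama status: $STATUS"
```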
4. Pull Language Model

Download a language model that supports tool calling:
ollama pull <model>
Replace <model> with your desired model (e.g., llama3.2, mistral, qwen2.5). For more options, see Ollama’s list of models that support tool calling.
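As a quick sanity check, you can confirm the model landed locally before wiring it into MCPHost (the model name below is only an example):

```shell
# Check that the pulled model appears in the local model list.
MODEL=llama3.2   # example; substitute the model you pulled
if ! command -v ollama >/dev/null 2>&1; then
  MSG="ollama CLI not on PATH"
elif ollama list | grep -q "$MODEL"; then
  MSG="$MODEL is available locally"
else
  MSG="$MODEL not found; run: ollama pull $MODEL"
fi
echo "$MSG"
```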
5. Install Go

Install Go from go.dev if you don’t already have it installed.
6. Install MCPHost

Install MCPHost using Go:
go install github.com/mark3labs/mcphost@latest
7. Add Go Bin to PATH

Ensure Go’s bin directory is in your PATH:
export PATH=$PATH:~/go/bin
Add this to your shell profile (~/.bashrc, ~/.zshrc, etc.) to make it permanent.
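A guarded version of the export avoids appending the directory twice when your profile is re-sourced; a minimal sketch:

```shell
# Append Go's bin directory to PATH only if it is not already present.
case ":$PATH:" in
  *":$HOME/go/bin:"*) ;;                        # already on PATH; do nothing
  *) PATH="$PATH:$HOME/go/bin"; export PATH ;;  # append and re-export
esac
echo "$PATH"
```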

Configuration

Create Configuration File

Create an MCPHost configuration file (e.g., ~/.mcphost.json). For more details, see the MCPHost MCP Servers documentation.

Standard Configuration (stdio)

macOS/Linux

{
  "mcpServers": {
    "oracle-oci-api-mcp-server": {
      "type": "stdio",
      "command": "uvx",
      "args": [
        "oracle.oci-api-mcp-server"
      ],
      "env": {
        "OCI_CONFIG_PROFILE": "<profile_name>",
        "FASTMCP_LOG_LEVEL": "ERROR"
      }
    }
  }
}
Replace <profile_name> with the OCI CLI profile you set up during authentication.
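Before starting MCPHost, it can save a debugging round-trip to confirm the file parses as JSON. A minimal sketch, assuming python3 is available (jq would work equally well):

```shell
# Validate that the MCPHost config file is well-formed JSON.
CONFIG="$HOME/.mcphost.json"
if [ -f "$CONFIG" ] && python3 -m json.tool "$CONFIG" >/dev/null 2>&1; then
  CONFIG_STATUS="valid"
else
  CONFIG_STATUS="missing or invalid"
fi
echo "config: $CONFIG_STATUS"
```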

Windows

Windows configuration follows the same pattern as macOS/Linux. The main difference is path formatting and environment variable syntax. Refer to the Installation guide for Windows-specific examples.

Podman Configuration

To run Oracle MCP Servers in containers using podman:
{
  "mcpServers": {
    "oracle-oci-api-mcp-server": {
      "type": "stdio",
      "command": "podman",
      "args": [
        "run", "-i", "--rm", 
        "-v", "/path/to/your/.oci:/app/.oci", 
        "oracle.oci-api-mcp-server:latest"
      ],
      "env": {
        "FASTMCP_LOG_LEVEL": "INFO"
      }
    }
  }
}
Important Notes:
  • Replace "/path/to/your/.oci" with the actual path to your OCI configuration directory
  • For servers not requiring OCI credentials, omit the -v volume mount
  • Ensure file paths inside your OCI config file use the ~ character rather than absolute host paths, so they resolve correctly inside the container
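Before launching MCPHost with this configuration, you can check that the image referenced above actually exists in local container storage (the image name is taken from the example config):

```shell
# Check local container storage for the image named in the config.
IMAGE=oracle.oci-api-mcp-server:latest
if ! command -v podman >/dev/null 2>&1; then
  IMG_STATUS="podman not installed"
elif podman image exists "$IMAGE"; then
  IMG_STATUS="present"
else
  IMG_STATUS="missing; build or pull it first"
fi
echo "$IMAGE: $IMG_STATUS"
```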

Starting MCPHost

Start MCPHost with your configuration:
OCI_CONFIG_PROFILE=<profile> mcphost -m ollama:<model> --config <config-path>
Parameters:
  • <profile>: The OCI CLI profile name you set up during authentication
  • <model>: The Ollama model you pulled (e.g., llama3.2)
  • <config-path>: Path to your MCPHost configuration file (e.g., ~/.mcphost.json)

Example

OCI_CONFIG_PROFILE=DEFAULT mcphost -m ollama:llama3.2 --config ~/.mcphost.json

Usage Examples

Once MCPHost is running, you can interact with Oracle Cloud Infrastructure through natural language:
> List all compute instances in my tenancy
> Show me the details of compartment ocid1.compartment.oc1...
> What are the available shapes for compute instances?
The local Ollama model will process your requests and use the Oracle MCP Server tools to interact with OCI.

Multiple Servers

You can configure multiple Oracle MCP Servers in your MCPHost configuration:
{
  "mcpServers": {
    "oracle-oci-api-mcp-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["oracle.oci-api-mcp-server"],
      "env": {
        "OCI_CONFIG_PROFILE": "DEFAULT",
        "FASTMCP_LOG_LEVEL": "ERROR"
      }
    },
    "oracle-dbtools-mcp-server": {
      "type": "stdio",
      "command": "uvx",
      "args": ["oracle.dbtools-mcp-server"],
      "env": {
        "FASTMCP_LOG_LEVEL": "ERROR"
      }
    }
  }
}
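With several servers configured, a quick way to see what MCPHost will try to launch is to list the server names from the config file. A sketch assuming python3 and the ~/.mcphost.json path used above:

```shell
# Print the names of all MCP servers defined in ~/.mcphost.json.
SERVERS=$(python3 - <<'EOF'
import json, os
path = os.path.expanduser("~/.mcphost.json")
if os.path.exists(path):
    with open(path) as f:
        print("\n".join(json.load(f).get("mcpServers", {})))
else:
    print("no config at", path)
EOF
)
echo "$SERVERS"
```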

Troubleshooting

Ollama Not Running

Error: connection refused when trying to connect to Ollama

Solution:
  • Verify Ollama is running: curl http://localhost:11434
  • Check Ollama service status:
    • macOS (homebrew): brew services list
    • Linux: sudo systemctl status ollama
  • Restart Ollama if needed

Model Not Found

Error: Model not available

Solution:
  • Check installed models: ollama list
  • Pull the model: ollama pull <model>
  • Verify the model name matches in your command

MCPHost Command Not Found

Error: mcphost: command not found

Solution:
  • Verify Go bin is in PATH: echo $PATH | grep go/bin
  • Add to PATH: export PATH=$PATH:~/go/bin
  • Verify installation: ls ~/go/bin/mcphost

MCP Server Connection Errors

Error: Failed to connect to MCP server

Solution:
  • Verify uv is installed and in PATH
  • Check OCI profile exists: cat ~/.oci/config
  • Ensure OCI authentication is valid
  • Set FASTMCP_LOG_LEVEL to DEBUG for detailed logs
  • For podman: Ensure container image is built

Authentication Failures

Error: OCI authentication errors

Solution:
  • Verify profile name matches: cat ~/.oci/config
  • Check if session token expired
  • Refresh token: oci session authenticate --profile-name <profile>
  • Ensure OCI_CONFIG_PROFILE environment variable is set correctly
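The first and last checks can be combined into one script. This sketch verifies that the profile named in OCI_CONFIG_PROFILE (defaulting to DEFAULT) actually appears as a section in ~/.oci/config:

```shell
# Check that the active OCI profile exists in the config file.
PROFILE="${OCI_CONFIG_PROFILE:-DEFAULT}"
OCI_CFG="$HOME/.oci/config"
if [ ! -f "$OCI_CFG" ]; then
  PROFILE_STATUS="no config file at $OCI_CFG"
elif grep -q "^\[$PROFILE\]" "$OCI_CFG"; then
  PROFILE_STATUS="profile [$PROFILE] found"
else
  PROFILE_STATUS="profile [$PROFILE] not found"
fi
echo "$PROFILE_STATUS"
```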

Performance Considerations

Model Selection

Choose models based on your hardware:
  • Small models (7B parameters): Faster, lower memory, suitable for most tasks
  • Medium models (13B parameters): Better accuracy, requires more RAM
  • Large models (70B+ parameters): Best accuracy, requires significant resources

Local vs Cloud AI

Benefits of MCPHost + Ollama:
  • Complete data privacy (everything runs locally)
  • No API costs for AI inference
  • Works offline
  • Faster for simple queries
Limitations:
  • Requires local compute resources
  • May be slower for complex reasoning
  • Model quality varies

Next Steps

Explore Available Servers

Discover all Oracle MCP Servers you can connect to

Ollama Models

Browse available Ollama models
