
Overview

ChatGPT can access external tools through custom actions and integrations. While ChatGPT does not natively support the MCP protocol, you can integrate the WebHelp MCP Server through OpenAI’s API or Custom GPT actions.

Integration Methods

Method 1: Custom GPT with Actions

Create a Custom GPT that connects to your WebHelp MCP Server through a proxy or OpenAPI interface.
Step 1: Create Custom GPT

Navigate to ChatGPT and create a new Custom GPT:
  1. Go to ChatGPT GPTs
  2. Click “Create a GPT”
  3. Configure the GPT with a name and description
Step 2: Configure Actions

Since MCP uses a streaming transport (SSE/Streamable HTTP) rather than plain REST, you’ll need to create an OpenAPI wrapper or use a proxy that converts MCP calls to a REST API. Example OpenAPI schema for the actions:
openapi.yaml
```yaml
openapi: 3.1.0
info:
  title: WebHelp Documentation Search
  version: 1.0.0
servers:
  - url: https://webhelp-mcp.example.com/api
paths:
  /search:
    post:
      operationId: searchDocumentation
      summary: Search documentation for relevant content
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                site:
                  type: string
                  description: Documentation site path (e.g., www.oxygenxml.com/doc/versions/27.1/ug-editor)
                query:
                  type: string
                  description: Search query string
              required:
                - site
                - query
      responses:
        '200':
          description: Search results
          content:
            application/json:
              schema:
                type: array
                items:
                  type: object
                  properties:
                    title:
                      type: string
                    id:
                      type: string
                    url:
                      type: string
  /fetch:
    post:
      operationId: fetchDocument
      summary: Fetch complete document content by ID
      requestBody:
        required: true
        content:
          application/json:
            schema:
              type: object
              properties:
                site:
                  type: string
                id:
                  type: string
              required:
                - site
                - id
      responses:
        '200':
          description: Document content
          content:
            application/json:
              schema:
                type: object
                properties:
                  title:
                    type: string
                  url:
                    type: string
                  text:
                    type: string
```
Step 3: Add Action to GPT

  1. In your Custom GPT configuration, go to “Configure” tab
  2. Scroll to “Actions” section
  3. Click “Create new action”
  4. Paste your OpenAPI schema
  5. Configure authentication if needed
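If your wrapper requires an API key, the schema can declare it with a standard OpenAPI security scheme. This fragment is a sketch: the `X-API-Key` header name is an assumption and should match whatever your wrapper or proxy actually expects.

```yaml
# Hypothetical API-key auth for the wrapper (header name is an example)
components:
  securitySchemes:
    ApiKeyAuth:
      type: apiKey
      in: header
      name: X-API-Key
security:
  - ApiKeyAuth: []
```

In the GPT action editor, choose “API Key” authentication and enter the same key so ChatGPT sends it with every request.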
Step 4: Test the Integration

Use your Custom GPT and ask it to search documentation:
Search the Oxygen documentation for WSDL validation.

Method 2: OpenAI API with Function Calling

Integrate WebHelp MCP Server into your application using OpenAI’s Function Calling feature.
Step 1: Define Functions

Create function definitions for the OpenAI API:
functions.js
```javascript
// Exported so app.js (Step 3) can import these definitions.
export const functions = [
  {
    name: "search_documentation",
    description: "Search documentation for relevant content. Supports boolean operators like AND, OR.",
    parameters: {
      type: "object",
      properties: {
        query: {
          type: "string",
          description: "Search query string"
        },
        site: {
          type: "string",
          description: "Documentation site path (e.g., www.oxygenxml.com/doc/versions/27.1/ug-editor)"
        }
      },
      required: ["query", "site"]
    }
  },
  {
    name: "fetch_document",
    description: "Retrieve complete document content by ID for detailed analysis",
    parameters: {
      type: "object",
      properties: {
        id: {
          type: "string",
          description: "Document ID from search results"
        },
        site: {
          type: "string",
          description: "Documentation site path"
        }
      },
      required: ["id", "site"]
    }
  }
];
```
Step 2: Implement Function Handlers

Create handlers that call your WebHelp MCP Server:
handlers.js
```javascript
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

export async function searchDocumentation(site, query) {
  // The transport constructor expects a URL object
  const transport = new StreamableHTTPClientTransport(
    new URL(`https://webhelp-mcp.example.com/${site}`)
  );
  const client = new Client({ name: 'chatgpt-bridge', version: '1.0.0' });

  await client.connect(transport);
  const result = await client.callTool({
    name: 'search',
    arguments: { query }
  });

  await client.close();
  return JSON.parse(result.content[0].text);
}

export async function fetchDocument(site, id) {
  const transport = new StreamableHTTPClientTransport(
    new URL(`https://webhelp-mcp.example.com/${site}`)
  );
  const client = new Client({ name: 'chatgpt-bridge', version: '1.0.0' });

  await client.connect(transport);
  const result = await client.callTool({
    name: 'fetch',
    arguments: { id }
  });

  await client.close();
  return JSON.parse(result.content[0].text);
}
```
Step 3: Integrate with OpenAI API

Use the functions in your OpenAI API calls:
app.js
```javascript
import OpenAI from 'openai';
// Definitions and handlers exported from the previous steps
import { functions } from './functions.js';
import { searchDocumentation, fetchDocument } from './handlers.js';

const openai = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY
});

async function chat(messages) {
  const response = await openai.chat.completions.create({
    model: 'gpt-4-turbo',
    messages: messages,
    functions: functions,
    function_call: 'auto'
  });

  const message = response.choices[0].message;

  // Handle function calls
  if (message.function_call) {
    const args = JSON.parse(message.function_call.arguments);
    let functionResponse;

    if (message.function_call.name === 'search_documentation') {
      functionResponse = await searchDocumentation(args.site, args.query);
    } else if (message.function_call.name === 'fetch_document') {
      functionResponse = await fetchDocument(args.site, args.id);
    }

    // Send the function result back to the model
    messages.push(message);
    messages.push({
      role: 'function',
      name: message.function_call.name,
      content: JSON.stringify(functionResponse)
    });

    return await chat(messages);
  }

  return message.content;
}
```

Example Usage

Custom GPT Example

User: Search for WSDL validation in the Oxygen documentation

GPT: I'll search the Oxygen XML Editor documentation for WSDL validation.
[Calls searchDocumentation action]

I found several relevant documents:

1. **WSDL Documentation** - Comprehensive guide to WSDL validation features
   URL: https://www.oxygenxml.com/doc/versions/27.1/ug-editor/topics/wsdl-content-completion.html

2. **Web Services Support** - Information about WSDL and SOAP support
   URL: https://www.oxygenxml.com/doc/versions/27.1/ug-editor/topics/web-services.html

Would you like me to fetch the full content of any document?

User: Get the first one

GPT: [Calls fetchDocument action]

Here's the detailed information about WSDL validation in Oxygen XML Editor:

[Summary of fetched content]

API Integration Example

```javascript
const messages = [
  {
    role: 'system',
    content: 'You are a helpful documentation assistant with access to Oxygen XML Editor documentation.'
  },
  {
    role: 'user',
    content: 'How do I validate a WSDL file in Oxygen?'
  }
];

const response = await chat(messages);
console.log(response);
```

REST API Proxy (Optional)

For easier integration, you can create a REST API proxy that bridges MCP to standard HTTP endpoints:
```javascript
import express from 'express';
import { Client } from '@modelcontextprotocol/sdk/client/index.js';
import { StreamableHTTPClientTransport } from '@modelcontextprotocol/sdk/client/streamableHttp.js';

const app = express();
app.use(express.json());

const MCP_BASE_URL = 'https://webhelp-mcp.example.com';

app.post('/api/search', async (req, res) => {
  try {
    const { site, query } = req.body;
    const transport = new StreamableHTTPClientTransport(new URL(`${MCP_BASE_URL}/${site}`));
    const client = new Client({ name: 'rest-proxy', version: '1.0.0' });

    await client.connect(transport);
    const result = await client.callTool({ name: 'search', arguments: { query } });
    await client.close();

    res.json(JSON.parse(result.content[0].text));
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

app.post('/api/fetch', async (req, res) => {
  try {
    const { site, id } = req.body;
    const transport = new StreamableHTTPClientTransport(new URL(`${MCP_BASE_URL}/${site}`));
    const client = new Client({ name: 'rest-proxy', version: '1.0.0' });

    await client.connect(transport);
    const result = await client.callTool({ name: 'fetch', arguments: { id } });
    await client.close();

    res.json(JSON.parse(result.content[0].text));
  } catch (error) {
    res.status(500).json({ error: error.message });
  }
});

app.listen(3000, () => {
  console.log('REST proxy running on port 3000');
});
```

Troubleshooting

Custom GPT actions are not working

Solution:
  1. Verify the OpenAPI schema is valid
  2. Test the API endpoints directly with curl or Postman
  3. Check the authentication configuration
  4. Review the action logs in the GPT configuration
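To test the endpoints directly from Node instead of curl, a small helper like this can confirm the wrapper responds before wiring it into the GPT. This is an illustrative sketch: the proxy URL and the `/api/search` path assume the REST proxy shown later in this guide.

```javascript
// Quick endpoint check for the REST API wrapper (illustrative sketch).
// Assumes the proxy from the "REST API Proxy" section is running locally.
const PROXY_URL = process.env.PROXY_URL || 'http://localhost:3000';

// Build the fetch options for a search call; kept pure so it is easy to inspect.
function buildSearchRequest(site, query) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ site, query })
  };
}

// Send one search request and fail loudly on a non-200 response.
async function checkSearchEndpoint(site, query) {
  const res = await fetch(`${PROXY_URL}/api/search`, buildSearchRequest(site, query));
  if (!res.ok) {
    throw new Error(`Search endpoint returned ${res.status}`);
  }
  return res.json();
}

// Example:
// checkSearchEndpoint('www.oxygenxml.com/doc/versions/27.1/ug-editor', 'WSDL validation')
//   .then(results => console.log(results))
//   .catch(err => console.error(err.message));
```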
The model never calls the functions

Solution:
  1. Ensure function descriptions are clear and specific
  2. Verify the model supports function calling (for example, GPT-4 or GPT-3.5-turbo)
  3. Check that function_call is set to 'auto' or to a specific function name
  4. Review the function parameters schema
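If the model still ignores the functions, you can force a specific call for one turn by passing an object instead of 'auto'. This sketch targets the legacy `functions` API used in Step 3; the newer `tools` API uses `tool_choice: { type: 'function', function: { name: ... } }` instead.

```javascript
// Build request options that force the model to call a specific function
// on this turn (legacy `functions` / `function_call` API).
function forcedCallOptions(functionName) {
  return {
    function_call: { name: functionName }
  };
}

// Spread into the request alongside model, messages, and functions:
// await openai.chat.completions.create({
//   model: 'gpt-4-turbo',
//   messages,
//   functions,
//   ...forcedCallOptions('search_documentation')
// });
```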
The proxy cannot reach the MCP server

Solution:
  1. Verify the MCP server URL is correct and accessible
  2. Check network connectivity and firewall rules
  3. Ensure the proxy has the correct dependencies installed
  4. Review the proxy logs for detailed error messages
When creating a Custom GPT or using the OpenAI API, be mindful of rate limits and costs associated with API calls and function executions.
For production deployments, implement caching for frequently accessed documents to reduce load on the WebHelp MCP Server and improve response times.
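As a sketch of the caching suggestion above, a small in-memory TTL cache can wrap the fetch handler. The cache size policy, TTL value, and the `fetchDocument` call in the usage comment are illustrative assumptions; production setups may prefer an LRU library or a shared cache such as Redis.

```javascript
// Minimal in-memory TTL cache for fetched documents (illustrative sketch).
function createDocumentCache(ttlMs = 5 * 60 * 1000) {
  const entries = new Map(); // key -> { value, expiresAt }

  return {
    // Return the cached value if it is still fresh; otherwise run `loader`,
    // store its result, and return it.
    async get(key, loader) {
      const hit = entries.get(key);
      if (hit && hit.expiresAt > Date.now()) {
        return hit.value;
      }
      const value = await loader();
      entries.set(key, { value, expiresAt: Date.now() + ttlMs });
      return value;
    }
  };
}

// Usage: wrap the MCP fetch call so repeated requests for the same
// document are served from memory.
// const cache = createDocumentCache();
// const doc = await cache.get(`${site}:${id}`, () => fetchDocument(site, id));
```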
