
Overview

This lesson demonstrates how to build a real-world AI agent that integrates with external APIs, handles diverse data types, manages errors, and orchestrates multiple tools. You will build a complete web search MCP server and client using SerpAPI.

The server exposes four tools:

  • general_search: broad web results across all domains
  • news_search: recent headlines and news articles
  • product_search: e-commerce product data
  • qna: direct question-and-answer snippets

Prerequisites

  • Python 3.8 or higher
  • A SerpAPI API key (sign up at serpapi.com — free tier available)

Setup

1. Install dependencies

# Using uv (recommended)
uv pip install -r requirements.txt

# Or using pip
pip install -r requirements.txt

2. Configure API key

Create a .env file in your project root:
SERPAPI_KEY=your_serpapi_key_here
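The server can load this key with python-dotenv. If you would rather avoid the extra dependency, a minimal stdlib-only loader is easy to sketch (load_env is an illustrative helper, not part of this project):

```python
import os
from pathlib import Path

def load_env(path: str = ".env") -> None:
    """Minimal .env loader: copy KEY=value lines into os.environ."""
    env_file = Path(path)
    if not env_file.exists():
        return
    for line in env_file.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue  # skip blanks, comments, and malformed lines
        key, _, value = line.partition("=")
        # setdefault: real environment variables win over .env entries
        os.environ.setdefault(key.strip(), value.strip())

load_env()
api_key = os.environ.get("SERPAPI_KEY")
```

Unlike python-dotenv, this sketch does not handle quoting or multi-line values, so prefer the library for anything beyond simple KEY=value files.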

3. Start the server

python server.py

4. Run the client

# Automated tests
python client.py

# Interactive mode
python client.py --interactive
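One way the client might dispatch between the two modes is a simple argparse flag; this is a sketch of the pattern, not the project's actual client.py (parse_mode is an illustrative name):

```python
import argparse

def parse_mode(argv) -> bool:
    """Return True when the client should run in interactive mode."""
    parser = argparse.ArgumentParser(description="MCP web search client")
    parser.add_argument(
        "--interactive",
        action="store_true",
        help="prompt for queries instead of running the automated tests",
    )
    # parse_args on an explicit list avoids touching sys.argv in tests
    return parser.parse_args(argv).interactive
```

The main entrypoint can then branch: run the automated test suite by default, or drop into a prompt loop when the flag is present.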

Server implementation

The server registers four tools and handles all requests from the MCP client.
# server.py (excerpt)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("web-search")

@mcp.tool()
async def general_search(query: str) -> str:
    """Perform a general web search and return formatted results."""
    # SerpAPI call, response parsing, error handling
    ...

if __name__ == "__main__":
    mcp.run()
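The SerpAPI call itself is elided above. A stdlib-only sketch of what it might look like, assuming SerpAPI's documented search endpoint and its organic_results response shape (fetch_search and format_results are illustrative names, not part of the project):

```python
import json
import urllib.parse
import urllib.request

SERPAPI_URL = "https://serpapi.com/search"

def fetch_search(query: str, api_key: str) -> dict:
    """Call SerpAPI's search endpoint and return the parsed JSON."""
    params = urllib.parse.urlencode({"q": query, "api_key": api_key})
    with urllib.request.urlopen(f"{SERPAPI_URL}?{params}") as resp:
        return json.load(resp)

def format_results(data: dict, limit: int = 3) -> str:
    """Flatten SerpAPI's nested organic_results into plain text blocks."""
    lines = []
    for item in data.get("organic_results", [])[:limit]:
        title = item.get("title", "(no title)")
        link = item.get("link", "")
        snippet = item.get("snippet", "")
        lines.append(f"{title}\n{link}\n{snippet}")
    return "\n\n".join(lines) or "No results found."
```

Keeping the HTTP call and the formatting in separate functions makes the formatter trivially unit-testable without hitting the network.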

Tool reference

general_search

Performs a general web search and returns formatted results. Parameters:
  • query (string, required): The search query
Example call:
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def run_general_search():
    server_params = StdioServerParameters(
        command="python",
        args=["server.py"],
    )
    async with stdio_client(server_params) as (reader, writer):
        async with ClientSession(reader, writer) as session:
            await session.initialize()
            result = await session.call_tool(
                "general_search",
                arguments={"query": "latest AI trends"}
            )
            print(result)
Example request:
{"query": "latest AI trends"}

news_search

Searches for recent news articles. Parameters:
  • query (string, required): The search query
Example call:
result = await session.call_tool(
    "news_search",
    arguments={"query": "AI policy updates"}
)
Example request:
{"query": "AI policy updates"}

product_search

Searches for products matching a query. Parameters:
  • query (string, required): The product search query
Example call:
result = await session.call_tool(
    "product_search",
    arguments={"query": "best AI gadgets 2025"}
)
Example request:
{"query": "best AI gadgets 2025"}

qna

Gets a direct answer to a question from search engine results. Parameters:
  • question (string, required): The question to find an answer for
Example call:
result = await session.call_tool(
    "qna",
    arguments={"question": "what is artificial intelligence"}
)
Example request:
{"question": "what is artificial intelligence"}
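On the server side, pulling a direct answer out of a SerpAPI response might look like the sketch below; the answer_box keys follow SerpAPI's documented response shape, and extract_answer is an illustrative helper, not part of the project:

```python
def extract_answer(data: dict) -> str:
    """Pull a direct answer out of a SerpAPI response, with fallbacks."""
    box = data.get("answer_box", {})
    # SerpAPI places direct answers under different keys per result type
    for key in ("answer", "snippet", "result"):
        if box.get(key):
            return box[key]
    # No answer box: fall back to the top organic result's snippet
    organic = data.get("organic_results", [])
    if organic and organic[0].get("snippet"):
        return organic[0]["snippet"]
    return "No direct answer found."
```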

Writing a custom test script

import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def test_custom_query():
    server_params = StdioServerParameters(
        command="python",
        args=["server.py"],
    )

    async with stdio_client(server_params) as (reader, writer):
        async with ClientSession(reader, writer) as session:
            await session.initialize()

            result = await session.call_tool(
                "general_search",
                arguments={"query": "your custom query"}
            )
            print(result)

if __name__ == "__main__":
    asyncio.run(test_custom_query())

Advanced concepts

  • Multi-tool orchestration: Running multiple tools (web search, news search, product search, Q&A) within a single MCP server lets the AI handle a variety of tasks in one interaction.
  • Rate limiting: External APIs limit request frequency. Implement exponential backoff and quota checks so your app degrades gracefully rather than crashing.
  • Response parsing: SerpAPI responses are deeply nested. Flatten them into clean, consistent structures before returning them to the AI model.
  • Error handling: Network failures and API errors should produce useful messages rather than stack traces. Wrap all external calls in try/except with specific error types.
  • Input validation: Validate all tool inputs before making API calls. Set sensible defaults and ensure required fields are present.
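The exponential-backoff idea above can be sketched as a small retry wrapper; call_with_backoff is an illustrative helper (the injectable sleep parameter exists so tests don't actually wait):

```python
import time

def call_with_backoff(fn, max_attempts=4, base_delay=1.0, sleep=time.sleep):
    """Retry fn with exponential backoff: wait 1s, 2s, 4s... between tries."""
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of retries: surface the error to the caller
            sleep(base_delay * (2 ** attempt))
```

In a real client you would catch only retryable errors (e.g. HTTP 429 or timeouts) rather than a bare Exception, and optionally add jitter to the delay.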

Debugging

Enable DEBUG logging to see detailed HTTP request/response traces:
# At the top of client.py or server.py
import logging
logging.basicConfig(
    level=logging.DEBUG,  # Change from INFO to DEBUG
    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
)
Normal output:
2025-06-01 10:15:23 - INFO - Calling general_search: {'query': 'open source LLMs'}
2025-06-01 10:15:24 - INFO - Successfully called general_search

GENERAL_SEARCH RESULTS:
...
Debug output:
2025-06-01 10:15:23 - INFO  - Calling general_search: {'query': 'open source LLMs'}
2025-06-01 10:15:23 - DEBUG - HTTP Request: GET https://serpapi.com/search ...
2025-06-01 10:15:23 - DEBUG - HTTP Response: 200 OK ...
2025-06-01 10:15:24 - INFO  - Successfully called general_search

Troubleshooting

  • SERPAPI_KEY not found: Create a .env file in your project root and add SERPAPI_KEY=your_key_here. Make sure python-dotenv or a similar package loads it at startup.
  • ModuleNotFoundError: Run pip install -r requirements.txt to install all required packages. Common missing modules: httpx, mcp, python-dotenv.
  • Client cannot connect: Verify server.py is running before starting the client, and check that both are using compatible MCP SDK versions.
  • Invalid or unauthorized API key: Your SerpAPI key is missing, incorrect, or expired. Verify the key in your SerpAPI dashboard, update your .env file, and check whether your free-tier quota is exhausted.
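Many of these failures are easier to diagnose if the server fails fast at startup instead of erroring mid-request. A sketch of such a check (require_api_key is an illustrative helper; the env parameter exists so it can be tested without touching the real environment):

```python
import os

def require_api_key(env=os.environ) -> str:
    """Fail fast with a clear message when SERPAPI_KEY is absent or blank."""
    key = env.get("SERPAPI_KEY", "").strip()
    if not key:
        raise RuntimeError(
            "SERPAPI_KEY is not set. Add it to your .env file and make sure "
            "it is loaded before the server starts."
        )
    return key
```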
