
Objectives

By the end of this lab you will be able to:
  • Install and configure all required development tools
  • Deploy Azure resources needed for the MCP server
  • Set up Docker containers for PostgreSQL and the MCP server
  • Validate your environment setup with test connections
  • Troubleshoot common setup issues and configuration problems

Prerequisites

System requirements

Requirement         Minimum
------------------  --------------------------------------------
Operating System    Windows 10/11, macOS, or Linux
RAM                 8 GB (16 GB recommended)
Storage             10 GB free
Network             Internet connection for downloads and Azure

Step 1: Install Docker Desktop

1. Install Docker (the brew command below is for macOS; on Windows or Linux, install Docker Desktop from docker.com):

brew install --cask docker

2. Verify the installation:

docker --version
docker-compose --version
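The version checklist later in this lab calls for Docker 20.10+. That check can be scripted; here is a minimal sketch that parses a version string, using a hard-coded sample line in place of live `docker --version` output:

```shell
# Sample line for illustration; in practice use: version_line=$(docker --version)
version_line='Docker version 24.0.7, build afdd53b'

# Extract "major.minor" from the string
ver=$(printf '%s' "$version_line" | sed -E 's/Docker version ([0-9]+\.[0-9]+).*/\1/')
major=${ver%%.*}
minor=${ver##*.}

# Compare against the 20.10 minimum
if [ "$major" -gt 20 ] || { [ "$major" -eq 20 ] && [ "$minor" -ge 10 ]; }; then
  echo "Docker $ver meets the 20.10+ requirement"
else
  echo "Docker $ver is too old; need 20.10 or newer" >&2
fi
```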

Step 2: Install Azure CLI

1. Install (macOS; on other platforms, use the installer from Microsoft's Azure CLI documentation):

brew install azure-cli

2. Verify and authenticate:

az version
az login
az account list --output table
az account set --subscription "Your-Subscription-Name"

Step 3: Install Python 3.11

brew install python@3.11
Homebrew installs the interpreter as python3.11 (you may need to add it to your PATH; brew info python@3.11 shows where it lives). Verify:
python3.11 --version   # Should show Python 3.11.x
python3.11 -m pip --version

Step 4: Install VS Code and extensions

# Install VS Code
# macOS
brew install --cask visual-studio-code
# Windows
winget install Microsoft.VisualStudioCode

# Install required extensions
code --install-extension ms-python.python
code --install-extension ms-vscode.vscode-json
code --install-extension ms-azuretools.vscode-docker
code --install-extension ms-vscode.azure-account

Step 5: Clone the repository

git clone https://github.com/microsoft/MCP-Server-and-PostgreSQL-Sample-Retail.git
cd MCP-Server-and-PostgreSQL-Sample-Retail

Step 6: Create Python virtual environment

python3.11 -m venv mcp-env

# Activate — macOS/Linux
source mcp-env/bin/activate

# Activate — Windows
mcp-env\Scripts\activate

python -m pip install --upgrade pip
pip install -r requirements.lock.txt
Verify key packages are installed:
pip list | grep fastmcp
pip list | grep asyncpg
pip list | grep azure
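If you prefer a single pass over the grep commands above, a small loop can report every missing package at once. A sketch, using a made-up `pip list`-style listing so the MISSING branch is visible (in practice, feed in the real `pip list` output):

```shell
# Illustrative sample listing; replace with: pip_list=$(pip list)
pip_list='fastmcp            2.3.0
asyncpg            0.29.0
pytest             8.0.0'

missing=0
for pkg in fastmcp asyncpg azure-ai-projects; do
  # Match the package name at the start of a line, followed by a space
  if printf '%s\n' "$pip_list" | grep -q "^$pkg "; then
    echo "$pkg: installed"
  else
    echo "$pkg: MISSING"
    missing=$((missing + 1))
  fi
done
echo "missing=$missing"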

Step 7: Deploy Azure resources

Option A: Scripted deployment

1. Run the deployment script:

cd infra
./deploy.sh
The script will:
  1. Create a unique resource group
  2. Deploy Azure AI Foundry resources
  3. Deploy the text-embedding-3-small model
  4. Configure Application Insights
  5. Create a service principal
  6. Generate a .env file with all configuration values
2. Verify the deployment (set RESOURCE_GROUP to the resource group name the script printed):

az group show --name $RESOURCE_GROUP --output table
az resource list --resource-group $RESOURCE_GROUP --output table

Option B: Manual deployment

RESOURCE_GROUP="rg-zava-mcp-$(date +%s)"
LOCATION="westus2"

az group create --name $RESOURCE_GROUP --location $LOCATION

az deployment group create \
  --resource-group $RESOURCE_GROUP \
  --template-file main.bicep \
  --parameters location=$LOCATION \
  --parameters resourcePrefix="zava-mcp"

Required Azure resources and estimated cost

Resource              Purpose                  Estimated cost
--------------------  -----------------------  ---------------
Azure AI Foundry      AI model hosting         $10–50/month
OpenAI Deployment     text-embedding-3-small   $5–20/month
Application Insights  Monitoring               $5–15/month
Resource Group        Organization             Free

Step 8: Configure environment variables

After deployment, confirm your .env file contains:
# Azure AI / OpenAI
PROJECT_ENDPOINT=https://your-project.cognitiveservices.azure.com/
AZURE_OPENAI_ENDPOINT=https://your-openai.openai.azure.com/
EMBEDDING_MODEL_DEPLOYMENT_NAME=text-embedding-3-small
AZURE_CLIENT_ID=your-client-id
AZURE_CLIENT_SECRET=your-client-secret
AZURE_TENANT_ID=your-tenant-id
APPLICATIONINSIGHTS_CONNECTION_STRING=InstrumentationKey=your-key;...

# Database (development)
POSTGRES_HOST=localhost
POSTGRES_PORT=5432
POSTGRES_DB=zava
POSTGRES_USER=postgres
POSTGRES_PASSWORD=your-secure-password
Never commit your .env file to version control. The repository’s .gitignore should already exclude it.
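Before starting the containers, it can help to confirm the .env file actually contains every key. A minimal sketch that scans for a subset of the required keys; it writes a deliberately incomplete sample file so the check is visible firing (point it at your real .env instead):

```shell
# Deliberately incomplete sample .env for demonstration
cat > .env.sample <<'EOF'
PROJECT_ENDPOINT=https://your-project.cognitiveservices.azure.com/
POSTGRES_HOST=localhost
POSTGRES_PASSWORD=your-secure-password
EOF

missing=0
for key in PROJECT_ENDPOINT AZURE_OPENAI_ENDPOINT AZURE_TENANT_ID POSTGRES_HOST POSTGRES_PASSWORD; do
  # Each key must appear at the start of a line, followed by '='
  grep -q "^${key}=" .env.sample || { echo "missing: $key"; missing=$((missing + 1)); }
done
echo "missing=$missing"
rm -f .env.sample
```

Run this against your real file with the full key list before `docker-compose up`; a non-zero missing count means the server will start with incomplete configuration.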

Step 9: Start the Docker environment

The docker-compose.yml launches PostgreSQL with pgvector and the MCP server:
version: '3.8'
services:
  postgres:
    image: pgvector/pgvector:pg17
    environment:
      POSTGRES_DB: zava
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-secure_password}
    ports:
      - "5432:5432"
    volumes:
      - ./data:/backup_data:ro
      - ./docker-init:/docker-entrypoint-initdb.d:ro
    healthcheck:   # required for the service_healthy condition below to work
      test: ["CMD-SHELL", "pg_isready -U postgres -d zava"]
      interval: 5s
      timeout: 5s
      retries: 10

  mcp_server:
    build: .
    depends_on:
      postgres:
        condition: service_healthy
    ports:
      - "8000:8000"
    env_file:
      - .env
Start the stack and check status:

docker-compose up -d
docker-compose ps
docker-compose logs -f
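Right after `docker-compose up -d`, the MCP server may not be ready immediately. A small retry helper can bridge that gap; here is a sketch with `true` standing in for the real probe (in practice you would pass something like `curl -fsS http://localhost:8000/health`, the health endpoint used later in this lab):

```shell
# retry N CMD ARGS... — run CMD up to N times, one second apart,
# returning success as soon as it succeeds.
retry() {
  tries=$1; shift
  i=1
  while [ "$i" -le "$tries" ]; do
    "$@" && return 0
    i=$((i + 1))
    sleep 1
  done
  echo "gave up after $tries attempts" >&2
  return 1
}

# Demo with a command that always succeeds
retry 3 true && echo "service is up"
```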

Step 10: Configure VS Code MCP integration

Create .vscode/mcp.json:
{
    "servers": {
        "zava-sales-analysis-headoffice": {
            "url": "http://127.0.0.1:8000/mcp",
            "type": "http",
            "headers": {"x-rls-user-id": "00000000-0000-0000-0000-000000000000"}
        },
        "zava-sales-analysis-seattle": {
            "url": "http://127.0.0.1:8000/mcp",
            "type": "http",
            "headers": {"x-rls-user-id": "f47ac10b-58cc-4372-a567-0e02b2c3d479"}
        },
        "zava-sales-analysis-redmond": {
            "url": "http://127.0.0.1:8000/mcp",
            "type": "http",
            "headers": {"x-rls-user-id": "e7f8a9b0-c1d2-3e4f-5678-90abcdef1234"}
        }
    },
    "inputs": []
}
Create .vscode/settings.json:
{
    "python.defaultInterpreterPath": "./mcp-env/bin/python",
    "python.linting.enabled": true,
    "python.testing.pytestEnabled": true,
    "python.testing.pytestArgs": ["tests"]
}

Validate your setup

Validation commands

# Verify database connectivity
docker-compose exec postgres psql -U postgres -d zava -c "\dt retail.*"
docker-compose exec postgres psql -U postgres -d zava -c "SELECT COUNT(*) FROM retail.stores;"

# Test the MCP server health endpoint
curl http://localhost:8000/health

# Test MCP protocol
curl -X POST http://localhost:8000/mcp \
  -H "Content-Type: application/json" \
  -H "Accept: application/json, text/event-stream" \
  -H "x-rls-user-id: 00000000-0000-0000-0000-000000000000" \
  -d '{"jsonrpc": "2.0", "id": 1, "method": "tools/list", "params": {}}'
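For reference, a successful tools/list call returns a JSON-RPC result containing a `tools` array. The sketch below uses an illustrative response body with made-up tool names (your server's actual tools will differ) and pulls the names out with grep:

```shell
# Illustrative response; in practice capture the curl output instead
response='{"jsonrpc":"2.0","id":1,"result":{"tools":[{"name":"get_store_sales"},{"name":"search_products"}]}}'

# Extract each "name" value, one per line
names=$(printf '%s\n' "$response" | grep -o '"name":"[^"]*"' | sed 's/"name":"//;s/"$//')
printf '%s\n' "$names"
```

An empty list (or a connection refused error) means the server started but the tools did not register, which is worth checking before moving on to the VS Code integration.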

Manual validation checklist

Basic tools:
  • Docker 20.10+ installed and running
  • Azure CLI 2.40+ installed and authenticated
  • Python 3.11 with pip installed
  • Git 2.30+ installed
  • VS Code with required extensions
Azure resources:
  • Resource group created successfully
  • AI Foundry project deployed
  • text-embedding-3-small model deployed
  • Application Insights configured
  • Service principal created with proper permissions
Environment:
  • .env file created with all required variables
  • PostgreSQL container running and accessible
  • Sample data loaded in database
VS Code integration:
  • .vscode/mcp.json configured
  • Python interpreter set to virtual environment
  • MCP servers appear in AI Chat

Troubleshooting

Docker issues:

docker info
docker system df
docker system prune -f
# Linux: restart the Docker daemon
sudo systemctl restart docker

Database issues:

docker-compose logs postgres
docker-compose ps
docker-compose exec postgres psql -U postgres -d zava -c "SELECT 1;"

Azure issues:

az account show
az role assignment list --assignee $(az account show --query user.name -o tsv)
az provider register --namespace Microsoft.CognitiveServices
az provider register --namespace Microsoft.Insights

Python package issues:

python -m pip install --upgrade pip setuptools wheel
pip cache purge
pip install fastmcp
pip install asyncpg
pip install azure-ai-projects

VS Code issues:

# Activate the virtual environment first, then open VS Code
source mcp-env/bin/activate   # macOS/Linux
mcp-env\Scripts\activate      # Windows
code .

Key takeaways

  • All tools installed and configured for development
  • Azure AI and monitoring resources deployed
  • Docker environment running PostgreSQL with pgvector
  • VS Code configured with MCP server connections
  • All components validated and tested together

Next: Lab 4 — Database Design

Explore the retail database schema in detail, understand multi-tenant data modeling, and work with sample data.
