Overview
Pricing Intelligence is configured through environment variables in Docker Compose. This guide covers all configuration options for each service and how to customize the deployment.

Environment Variables
Core API Keys
Two OpenAI API keys are required:

- An OpenAI API key for the H.A.R.V.E.Y. agent service, used by the ReAct agent reasoning loop. Set in .env.
- An OpenAI API key for the A-MINT extraction service, used for extracting pricing from URLs. Set in .env.

You can use the same API key for both services, or separate keys for better cost tracking and rate-limit isolation.
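A minimal .env sketch for the two keys; the variable names (OPENAI_API_KEY, AMINT_OPENAI_API_KEY) are assumptions, since the original snippet is not shown:

```bash
# Hypothetical variable names
OPENAI_API_KEY=sk-...          # H.A.R.V.E.Y. agent service
AMINT_OPENAI_API_KEY=sk-...    # A-MINT extraction service
```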
Alternative: Multiple Keys
Comma-separated list of OpenAI API keys for load balancing or failover.
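Assuming a variable such as OPENAI_API_KEYS (the actual name is not shown in the original), the multi-key form might look like:

```bash
# Hypothetical variable name; comma-separated keys rotate for load balancing/failover
OPENAI_API_KEYS=sk-key-one,sk-key-two
```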
Service Configuration
Harvey API
The H.A.R.V.E.Y. agent service is configured in docker-compose.yml:
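A hedged sketch of what this fragment might look like; the service name (harvey-api) and variable names (OPENAI_MODEL, LOG_LEVEL, CACHE_BACKEND, MCP_TRANSPORT) are assumptions, while the value choices come from the descriptions below:

```yaml
# Hypothetical fragment; names are assumptions
services:
  harvey-api:
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - OPENAI_MODEL=gpt-4o-mini   # see model options below
      - LOG_LEVEL=INFO             # DEBUG, INFO, WARNING, ERROR
      - CACHE_BACKEND=memory       # memory or redis
      - MCP_TRANSPORT=sse          # sse or stdio
```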
OpenAI model to use for the agent. Options:
- gpt-4o: Best quality; slower, more expensive
- gpt-4o-mini: Good balance of speed and quality
- gpt-3.5-turbo: Fastest and cheapest, lower quality
- gpt-5-nano: Custom/preview model (if available)
- Logging verbosity: DEBUG, INFO, WARNING, or ERROR
- Cache backend: memory or redis
- MCP transport protocol: sse (Server-Sent Events) or stdio

MCP Server
The Model Context Protocol server is configured in docker-compose.yml:
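A sketch of this fragment, assuming hypothetical variable names; the service name mcp-server and port 8085 match the troubleshooting example later in this guide:

```yaml
# Hypothetical fragment; variable names are assumptions
services:
  mcp-server:
    environment:
      - HOST=0.0.0.0        # bind address for the HTTP API
      - PORT=8085           # HTTP/SSE endpoint port
      - UVICORN_HOST=0.0.0.0
      - UVICORN_PORT=8085
```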
- Bind address for the HTTP API
- Port for the HTTP/SSE endpoint
- Uvicorn server host
- Uvicorn server port
A-MINT API
The pricing extraction service is configured in docker-compose.yml:
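A hedged sketch; the service name, internal port, and version prefix are assumptions, while ANALYSIS_BASE_URL is referenced elsewhere in this guide:

```yaml
# Hypothetical fragment; service name and internal port are assumptions
services:
  amint-api:
    ports:
      - "8001:8000"   # internal port exposed as 8001 externally
    environment:
      - OPENAI_API_KEY=${AMINT_OPENAI_API_KEY}
      - ANALYSIS_BASE_URL=http://analysis-api:3000/api/v1
```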
- Internal service port (exposed as 8001 externally)
- Base URL for the Analysis API (with version prefix)
Analysis API
The Node.js analysis service is configured in docker-compose.yml:
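A sketch of this fragment; CHOCO_API is referenced later in this guide, while the service names and port are assumptions:

```yaml
# Hypothetical fragment; service names and port are assumptions
services:
  analysis-api:
    environment:
      - NODE_ENV=production              # development or production
      - CHOCO_API=http://choco-api:8080  # Choco CSP solver base URL
```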
- Node environment: development or production
- Base URL for the Choco CSP solver service
CSP Service (Choco)
The Java-based constraint solver is configured in docker-compose.yml:
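A minimal sketch, assuming a Spring-style SERVER_PORT variable and a choco-api service name (both assumptions):

```yaml
# Hypothetical fragment; service and variable names are assumptions
services:
  choco-api:
    environment:
      - SERVER_PORT=8080   # service port
```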
- Service port
Frontend
The React/Vite frontend is configured in docker-compose.yml:
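A sketch of the frontend build args; VITE_API_BASE_URL is referenced later in this guide, while the Sphere variable name and port values are assumptions:

```yaml
# Hypothetical fragment; Sphere variable name and ports are assumptions
services:
  frontend:
    build:
      args:
        # Must be reachable from the browser, not just inside Docker
        - VITE_API_BASE_URL=http://localhost:8000
        # Optional Sphere integration endpoint
        - VITE_SPHERE_API_URL=http://localhost:9000
```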
- Harvey API endpoint (must be accessible from the browser)
- Optional: Sphere integration endpoint for advanced pricing extraction
Changing OpenAI Models
For H.A.R.V.E.Y. Agent
Edit docker-compose.yml:
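A hedged example of switching models, assuming the service and variable names used above (both assumptions):

```yaml
# Hypothetical fragment; names are assumptions
services:
  harvey-api:
    environment:
      - OPENAI_MODEL=gpt-4o   # e.g. switch from gpt-4o-mini for higher quality
```

Restart the service afterwards so the new model takes effect.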
Model Recommendations
GPT-4o
Best for:
- Complex reasoning
- Multi-step optimization
- Accurate feature extraction
Trade-offs:
- Slower responses
- Higher costs
GPT-4o-mini
Best for:
- Balanced performance
- Most production use cases
- Cost-effective operation
Trade-offs:
- Slightly lower accuracy
GPT-3.5-turbo
Best for:
- Development/testing
- Simple queries
- Budget-conscious deployments
Trade-offs:
- May miss nuances
- Less reliable reasoning
Cache Configuration
The MCP server and Harvey API support two cache backends:

Memory Cache (Default)
- Simple in-memory cache
- No external dependencies
- Cache cleared on service restart
- Suitable for single-instance deployments
Redis Cache
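To use Redis, point the services at a Redis instance. A sketch under assumed names (CACHE_BACKEND, REDIS_URL, and the service names are assumptions):

```yaml
# Hypothetical fragment; variable and service names are assumptions
services:
  harvey-api:
    environment:
      - CACHE_BACKEND=redis
      - REDIS_URL=redis://redis:6379/0
  redis:
    image: redis:7-alpine
    restart: unless-stopped
```

Unlike the in-memory backend, a Redis cache survives service restarts and can be shared across instances.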
Port Configuration
All service ports can be customized in docker-compose.yml:
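Port mappings follow the standard Compose host:container form; a sketch with assumed service names and container ports:

```yaml
# Hypothetical fragment; service names and container ports are assumptions
services:
  harvey-api:
    ports:
      - "8000:8000"   # host:container; change the left side to move the host port
  amint-api:
    ports:
      - "8001:8000"   # exposed as 8001 externally
```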
When changing ports, update:
- Service URLs in environment variables
- VITE_API_BASE_URL in frontend build args
- Internal service references (e.g., CHOCO_API, ANALYSIS_BASE_URL)
Volume Mounts
Persist logs and output with volume mounts.

Health Checks
All services include health checks for monitoring.

Restart Policies
All services use restart: unless-stopped.
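Taken together, the volume, health-check, and restart settings for a single service might look like the following sketch (service name, paths, and health endpoint are assumptions):

```yaml
# Hypothetical fragment; service name, paths, and endpoint are assumptions
services:
  harvey-api:
    restart: unless-stopped
    volumes:
      - ./logs:/app/logs        # persist logs on the host
      - ./output:/app/output    # persist generated output
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3
```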
Service Dependencies
Services start in the correct order based on dependencies. depends_on ensures dependent services start first, but doesn’t wait for them to be “ready”; health checks provide readiness detection.

MCP Environment Variables
The MCP server supports additional configuration:
- Python module to launch (for stdio transport)
- Path to Python binary (optional)
- Additional PYTHONPATH entries (colon-separated)
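A sketch of these settings; all three variable names and values below are assumptions, since the originals are not shown:

```bash
# Hypothetical variable names and values
MCP_MODULE=mcp_server.main          # Python module to launch (stdio transport)
PYTHON_BIN=/usr/local/bin/python    # optional path to the Python binary
EXTRA_PYTHONPATH=/app/src:/app/lib  # colon-separated extra PYTHONPATH entries
```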
Complete .env Example
Create a .env file in the project root:
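A hedged example of a complete .env; every variable name here is an assumption, consistent with the sketches earlier in this guide:

```bash
# Hypothetical variable names

# OpenAI keys (the same key may be reused for both services)
OPENAI_API_KEY=sk-...          # H.A.R.V.E.Y. agent service
AMINT_OPENAI_API_KEY=sk-...    # A-MINT extraction service

# Optional: comma-separated keys for load balancing or failover
OPENAI_API_KEYS=sk-key-one,sk-key-two

# Agent settings
OPENAI_MODEL=gpt-4o-mini
LOG_LEVEL=INFO
CACHE_BACKEND=memory           # memory or redis
MCP_TRANSPORT=sse              # sse or stdio
```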
Troubleshooting Configuration
Missing API Keys
Problem: Services fail to start or return authentication errors.
Solution: Verify that both OpenAI API keys are set in .env, then restart the affected services.

Port Conflicts
Problem: “Port already in use” errors.
Solution: Change the conflicting host-side port mapping in docker-compose.yml, or stop the process already using the port.

Service Can’t Reach Dependencies
Problem: Harvey API can’t connect to MCP server.
Solution:
- Use Docker service names, not localhost: http://mcp-server:8085
- Verify services are on the same Docker network
- Check health status: docker-compose ps
Cache Not Working
Problem: Repeated URL extractions despite caching.
Solution: Confirm the cache backend setting and, if using Redis, verify that the Redis service is running and reachable.

Next Steps
Basic Usage
Start using the H.A.R.V.E.Y. chat interface
Architecture
Learn how the platform components work together