Overview
Pricing Intelligence can be deployed in multiple ways depending on your needs:
- Docker Compose (recommended): production-ready deployment with all services
- Local Development: individual service development with hot reloading
- Kubernetes: scalable cloud deployment (advanced)
System Requirements
Hardware
- CPU: 4+ cores recommended (CSP service is compute-intensive)
- RAM: 8GB minimum, 16GB recommended
- Storage: 5GB for Docker images and dependencies
Software
- Docker: 24.0+ with Compose plugin
- Node.js: 20+ (for local Analysis API development)
- Python: 3.11+ (for local Harvey/MCP/A-MINT development)
- Java: 17+ (for local CSP service development)
Production Deployment (Docker Compose)
Step 1: Clone the Repository
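Assuming the project lives in a Git repository (the URL and directory name below are placeholders, not taken from this document):

```shell
# Clone the repository and enter the project directory
git clone <repository-url>
cd <repository-directory>
```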
Step 2: Configure Environment Variables
The system requires OpenAI API keys for two services (Harvey and A-MINT). Set OPENAI_API_KEY using one of the following methods:
- Linux/macOS: export it in ~/.bashrc or ~/.zshrc
- Windows (PowerShell): set it as a user environment variable
- .env file: define it alongside docker-compose.yml
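For example, on Linux/macOS (the key value below is a placeholder):

```shell
# Append to ~/.bashrc or ~/.zshrc, then open a new shell or `source` the file
export OPENAI_API_KEY="sk-your-key-here"
```

Equivalently, a .env file next to docker-compose.yml containing `OPENAI_API_KEY=sk-your-key-here` is read by Docker Compose for variable interpolation.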
Step 3: Configure Services
Edit docker-compose.yml to customize service configurations:
Change the OpenAI model
Modify the Harvey API service:
docker-compose.yml
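A sketch of the relevant service block; the service name and variable follow the environment reference tables below, not the exact file:

```yaml
services:
  harvey-api:
    environment:
      - OPENAI_MODEL=gpt-5-nano   # default per the reference table; change as needed
```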
Adjust port mappings
Change the host port (left side) if ports are already in use:
docker-compose.yml
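As an example, assuming the Analysis API is published on its default port 3000 (per the reference table below):

```yaml
services:
  analysis-api:
    ports:
      - "3100:3000"   # host:container — change the left side if 3000 is taken
```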
Configure logging levels
Adjust verbosity for debugging:
docker-compose.yml
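A sketch using the LOG_LEVEL variable from the reference tables below (harvey-api as the example service):

```yaml
services:
  harvey-api:
    environment:
      - LOG_LEVEL=DEBUG   # default is INFO; DEBUG for verbose troubleshooting
```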
Enable persistent storage
Mount volumes to preserve data between restarts:
docker-compose.yml
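A sketch using a named volume; the mount path matches the HARVEY_STATIC_DIR default from the reference table, while the volume name is an example:

```yaml
services:
  harvey-api:
    volumes:
      - harvey-static:/app/static   # persists uploaded files across restarts

volumes:
  harvey-static:
```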
Step 4: Launch the Platform
- Foreground (development)
- Background (production)
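Using the standard Docker Compose CLI, the two launch modes can be sketched as:

```shell
# Foreground: stream logs from all services (stop with Ctrl+C)
docker compose up --build

# Background: run detached for production
docker compose up -d --build
```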
Run in the foreground to see live logs; stop with Ctrl+C.
Step 5: Verify Deployment
Check that all services are healthy:
Local Development Setup
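With the standard Docker Compose CLI, a quick health check looks like:

```shell
# Show container state and health status for every service
docker compose ps
```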
Develop individual services with hot reloading:
Harvey API (Python)
Run with hot reload
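Assuming Harvey is an ASGI app served by uvicorn (the module path here is a guess, not confirmed by this document):

```shell
# Install dependencies, then start with auto-restart on code changes
pip install -r requirements.txt
uvicorn main:app --reload
```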
MCP Server (Python)
Analysis API (Node.js)
Run in development mode
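Assuming conventional npm scripts (the script name is an assumption):

```shell
# Install dependencies and start with hot reloading
npm install
npm run dev
```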
A-MINT API (Python)
Run the service
For detailed A-MINT configuration, visit the A-MINT repository.
CSP Service (Java)
Frontend (React)
Run development server
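Assuming conventional npm scripts for a React project (the script name is an assumption):

```shell
npm install
npm run dev   # or npm start, depending on the project's scripts
```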
Environment Variables Reference
Harvey API
| Variable | Default | Description |
|---|---|---|
| OPENAI_API_KEY | - | Required: OpenAI API key |
| OPENAI_MODEL | gpt-5-nano | Model to use for Harvey agent |
| MCP_SERVER_URL | http://mcp-server:8085/sse | MCP server endpoint |
| MCP_TRANSPORT | sse | Transport protocol: sse or stdio |
| HARVEY_STATIC_DIR | /app/static | Directory for uploaded files |
| LOG_LEVEL | INFO | Logging verbosity |
| CACHE_BACKEND | memory | Cache backend: memory or redis |
MCP Server
| Variable | Default | Description |
|---|---|---|
| AMINT_BASE_URL | http://a-mint-api:8000 | A-MINT API endpoint |
| ANALYSIS_BASE_URL | http://analysis-api:3000 | Analysis API endpoint |
| CACHE_BACKEND | memory | Cache backend: memory or redis |
| LOG_LEVEL | INFO | Logging verbosity |
| HTTP_HOST | 0.0.0.0 | Bind address |
| HTTP_PORT | 8085 | Server port |
| MCP_TRANSPORT | sse | Transport protocol |
Analysis API
| Variable | Default | Description |
|---|---|---|
| NODE_ENV | production | Environment: development or production |
| PORT | 3000 | Server port |
| CHOCO_API | http://choco-api:8000 | CSP service endpoint |
| LOG_LEVEL | INFO | Logging verbosity |
A-MINT API
| Variable | Default | Description |
|---|---|---|
| OPENAI_API_KEY | - | Required: OpenAI API key |
| OPENAI_API_KEYS | - | Optional: Comma-separated list for load balancing |
| ANALYSIS_API | http://analysis-api:3000/api/v1 | Analysis API endpoint |
| PYTHONPATH | /app | Python module search path |
| PORT | 8000 | Server port |
| LOG_LEVEL | INFO | Logging verbosity |
CSP Service
| Variable | Default | Description |
|---|---|---|
| PORT | 8000 | Server port |
| LOG_LEVEL | INFO | Logging verbosity |
Advanced Configuration
Using Redis for Caching
For improved performance in multi-instance deployments, set CACHE_BACKEND=redis on the Harvey API and MCP Server and add a Redis service to docker-compose.yml.
Scaling Services
Scale specific services for higher load. You’ll need a load balancer (e.g., nginx) to distribute traffic across scaled instances.
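Docker Compose’s --scale flag runs multiple replicas of a service; the Analysis API is used here as an example:

```shell
# Start three replicas of the Analysis API
docker compose up -d --scale analysis-api=3
```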
Custom Volume Mounts
Persist data and enable live code editing:
docker-compose.yml
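A sketch combining a bind mount for live editing with a named volume for data; the source path and volume name are examples:

```yaml
services:
  harvey-api:
    volumes:
      - ./harvey:/app               # bind mount for live code editing
      - harvey-static:/app/static   # named volume for persisted uploads

volumes:
  harvey-static:
```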
Network Configuration
Create a custom network for service isolation:
docker-compose.yml
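A sketch of a custom bridge network; the network name is an example:

```yaml
networks:
  pricing-net:
    driver: bridge

services:
  harvey-api:
    networks:
      - pricing-net
  mcp-server:
    networks:
      - pricing-net
```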
Troubleshooting
Port already in use
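Assuming lsof or ss is available on the host (port 3000 used as an example):

```shell
lsof -i :3000           # show the PID holding the port
# or, with ss:
ss -ltnp | grep :3000
```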
Find the process using the port, then kill it or change the port mapping in docker-compose.yml.
Container fails to build
Clear Docker cache and rebuild:
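With the standard Docker Compose CLI, that looks like:

```shell
docker compose build --no-cache     # rebuild images without cached layers
docker compose up -d --force-recreate
```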
Service unhealthy after startup
Check the logs of the failing service. Common issues:
- Missing environment variables
- Dependency services not ready
- Insufficient memory (increase Docker memory limit)
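Inspecting a single service’s logs (the service name here is an example):

```shell
# Follow logs for one service; replace harvey-api with the failing service
docker compose logs -f harvey-api
```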
OpenAI API errors
Verify that OPENAI_API_KEY is set for both the Harvey and A-MINT services and that the key is valid.
MCP connection failures
Ensure the MCP server is reachable and check network connectivity between containers:
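Two quick checks, assuming the default MCP port from the reference table and that getent is present in the container image:

```shell
# From the host: confirm the MCP SSE endpoint answers
curl -i --max-time 5 http://localhost:8085/sse

# From inside another container: confirm the mcp-server hostname resolves
docker compose exec harvey-api getent hosts mcp-server
```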
CSP solver timeouts
Increase timeout limits in the Analysis API, or allocate more CPU to the CSP service:
docker-compose.yml
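A sketch of a CPU limit for the CSP service; the service name follows the CHOCO_API endpoint in the reference table, and the limit value is an example:

```yaml
services:
  choco-api:
    deploy:
      resources:
        limits:
          cpus: "4"   # give the solver more CPU headroom
```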
Production Considerations
Security
Use secrets management
Never hardcode API keys. Use Docker secrets or external secret managers:
docker-compose.yml
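A sketch using Docker Compose secrets; the secret name and file path are examples:

```yaml
services:
  harvey-api:
    secrets:
      - openai_api_key

secrets:
  openai_api_key:
    file: ./secrets/openai_api_key.txt   # keep this file out of version control
```

Compose mounts the secret at /run/secrets/openai_api_key inside the container, so the service must read the key from that file rather than from an environment variable.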
Monitoring
- Health Checks
- Logging
- Metrics
Configure comprehensive health checks:
docker-compose.yml
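A sketch of a Compose healthcheck; the endpoint, port, and timings are assumptions, and curl must exist in the image:

```yaml
services:
  harvey-api:
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8080/health"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 15s   # grace period before failures count
```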
Backup & Recovery
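One way to archive a named Docker volume to a dated tarball (the volume name is an example):

```shell
# Mount the volume read-only into a throwaway container and tar its contents
docker run --rm \
  -v harvey-static:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf /backup/harvey-static-$(date +%F).tar.gz -C /data .
```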
Regularly back up persistent data, including named volumes and any bind-mounted directories.
Next Steps
- Harvey API: explore the Harvey API and chat interface
- Architecture: deep dive into system design and component interactions
- Pricing Models: learn the Pricing2Yaml data format
- Basic Usage: learn how to use the chat interface