Docker Deployment
Deploy LangShazam as a Docker container for portability and consistency across environments.

Overview

The Docker deployment uses a multi-stage build with Python 3.9-slim as the base image, includes health checks, and exposes the application on port 10000.

Prerequisites

  • Docker Engine 20.10 or higher
  • A valid OpenAI API key with Whisper access

Dockerfile

The official Dockerfile is located at backend/deployment/docker/Dockerfile:
FROM python:3.9-slim

WORKDIR /app

# Copy requirements first to leverage Docker cache
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy source code
COPY src/ /app/src/

# Set environment variables
ENV PYTHONPATH=/app
ENV PORT=10000

# Health check
HEALTHCHECK --interval=30s --timeout=5s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:$PORT/ || exit 1

# Expose port
EXPOSE $PORT

# Run the application with uvicorn
CMD ["python", "-m", "uvicorn", "src.main:app", "--host", "0.0.0.0", "--port", "10000"]
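Note that python:3.9-slim does not ship with curl, so the HEALTHCHECK above fails with exit code 127 unless curl is installed in the image. If the container flaps between "starting" and "unhealthy", adding an install step before the HEALTHCHECK line (a suggested fix, not part of the Dockerfile shown above) resolves it:

```dockerfile
# curl is not included in slim Debian-based images; install it so the
# HEALTHCHECK command can actually run, then clean up the apt cache
RUN apt-get update \
    && apt-get install -y --no-install-recommends curl \
    && rm -rf /var/lib/apt/lists/*
```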

Quick Start

1. Clone the Repository

git clone <your-repo-url>
cd langshazam
2. Build the Docker Image

cd backend
docker build -t langshazam-backend -f deployment/docker/Dockerfile .
The build context is the backend directory, not the docker subdirectory.
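To keep that build context small and avoid baking secrets or caches into the image, a .dockerignore in the backend directory is worth adding. This is a suggested starting point, not a file shipped with the repo:

```text
__pycache__/
*.pyc
.env
logs/
.git/
```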
3. Run the Container

docker run -d \
  --name langshazam \
  -p 10000:10000 \
  -e OPENAI_API_KEY=your_api_key_here \
  langshazam-backend
4. Verify It's Running

curl http://localhost:10000/
Expected response:
{"message": "Server is running!"}
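For scripting this verification step (for example in CI), a small polling helper can wait for the container to come up before proceeding. This is an illustrative standalone script, not part of LangShazam itself:

```python
import json
import time
import urllib.error
import urllib.request

def wait_until_healthy(url: str, timeout: float = 30.0, interval: float = 1.0) -> dict:
    """Poll `url` until it returns HTTP 200, then return the decoded JSON body.

    Raises TimeoutError if the endpoint never responds within `timeout` seconds.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=5) as resp:
                if resp.status == 200:
                    return json.loads(resp.read().decode())
        except (urllib.error.URLError, OSError):
            pass  # container not ready yet; keep polling
        time.sleep(interval)
    raise TimeoutError(f"{url} did not become healthy within {timeout}s")
```

Calling wait_until_healthy("http://localhost:10000/") after docker run blocks until the expected JSON response above comes back.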

Configuration Options

Environment Variables

Pass environment variables using the -e flag:
docker run -d \
  --name langshazam \
  -p 10000:10000 \
  -e OPENAI_API_KEY=sk-... \
  -e DEBUG=true \
  -e LOGGING_LEVEL=INFO \
  langshazam-backend
See Environment Variables for all available options.

Port Mapping

Map the container port to a different host port:
# Run on port 8080 instead of 10000
docker run -d -p 8080:10000 -e OPENAI_API_KEY=sk-... langshazam-backend

Volume Mounts

Mount logs directory for persistence:
docker run -d \
  --name langshazam \
  -p 10000:10000 \
  -e OPENAI_API_KEY=sk-... \
  -v $(pwd)/logs:/app/logs \
  langshazam-backend

Using Docker Compose

For easier management, use Docker Compose. Create a docker-compose.yml:
version: '3.8'

services:
  langshazam:
    build:
      context: ./backend
      dockerfile: deployment/docker/Dockerfile
    container_name: langshazam
    restart: always
    ports:
      - "10000:10000"
    environment:
      - OPENAI_API_KEY=${OPENAI_API_KEY}
      - DEBUG=false
      - LOGGING_LEVEL=INFO
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:10000/"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 5s
    volumes:
      - app_logs:/app/logs

volumes:
  app_logs:
Then run:
export OPENAI_API_KEY=your_api_key
docker-compose up -d
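Alternatively, Docker Compose automatically reads a .env file located next to docker-compose.yml, so the export step can be skipped:

```text
# .env (same directory as docker-compose.yml; keep it out of version control)
OPENAI_API_KEY=sk-...
```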

Health Checks

The container includes built-in health monitoring:
HEALTHCHECK --interval=30s --timeout=5s --start-period=5s --retries=3 \
    CMD curl -f http://localhost:$PORT/ || exit 1
Check health status:
docker ps
# Look for "healthy" in the STATUS column

# Or inspect detailed health:
docker inspect --format='{{json .State.Health}}' langshazam | jq
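If you need that health JSON in a script and jq is not available, a few lines of Python can summarize it. The field names (Status, FailingStreak, Log, ExitCode) are what Docker emits; the helper itself is illustrative:

```python
import json

def summarize_health(raw: str) -> str:
    """Turn the JSON from `docker inspect --format='{{json .State.Health}}'`
    into a one-line summary of the container's health state."""
    health = json.loads(raw)
    summary = f"status={health['Status']} failing_streak={health.get('FailingStreak', 0)}"
    log = health.get("Log") or []
    if log:
        # each Log entry records one health-check run; the last one is current
        summary += f" last_exit_code={log[-1]['ExitCode']}"
    return summary
```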

Container Management

View Logs

# Follow logs in real-time
docker logs -f langshazam

# View last 100 lines
docker logs --tail 100 langshazam

# View logs with timestamps
docker logs -t langshazam

Restart Container

docker restart langshazam

Stop and Remove

docker stop langshazam
docker rm langshazam

Update to Latest Version

# Pull latest code
git pull

# Rebuild image
cd backend
docker build -t langshazam-backend -f deployment/docker/Dockerfile .

# Stop old container
docker stop langshazam
docker rm langshazam

# Start new container
docker run -d --name langshazam -p 10000:10000 -e OPENAI_API_KEY=sk-... langshazam-backend

Production Deployment

Using a Reverse Proxy

For production, run behind a reverse proxy such as nginx or Traefik. Example nginx configuration:
server {
    listen 80;
    server_name api.yourdomain.com;

    location / {
        proxy_pass http://localhost:10000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        
        # WebSocket support
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}

Resource Limits

Limit CPU and memory usage:
docker run -d \
  --name langshazam \
  --memory="512m" \
  --cpus="1.0" \
  -p 10000:10000 \
  -e OPENAI_API_KEY=sk-... \
  langshazam-backend

Auto-restart Policy

docker run -d \
  --name langshazam \
  --restart unless-stopped \
  -p 10000:10000 \
  -e OPENAI_API_KEY=sk-... \
  langshazam-backend
Restart policies:
  • no: Don’t restart automatically (default)
  • on-failure: Restart only if container exits with error
  • always: Always restart if stopped
  • unless-stopped: Always restart unless explicitly stopped

Docker Registry

Push to Docker Hub

# Tag the image
docker tag langshazam-backend your-username/langshazam-backend:latest

# Login
docker login

# Push
docker push your-username/langshazam-backend:latest

Push to AWS ECR

# Authenticate
aws ecr get-login-password --region us-east-1 | \
  docker login --username AWS --password-stdin 123456789.dkr.ecr.us-east-1.amazonaws.com

# Tag
docker tag langshazam-backend 123456789.dkr.ecr.us-east-1.amazonaws.com/langshazam:latest

# Push
docker push 123456789.dkr.ecr.us-east-1.amazonaws.com/langshazam:latest

Troubleshooting

Container Won’t Start

Check logs for errors:
docker logs langshazam
Common issues:
  • Missing OPENAI_API_KEY
  • Port 10000 already in use
  • Insufficient memory

Health Check Failing

Enter the container to debug:
docker exec -it langshazam /bin/bash
curl http://localhost:10000/

High Memory Usage

Monitor resource usage:
docker stats langshazam
Set memory limits if needed (see Resource Limits above).

Permission Errors

If you get permission errors with volume mounts:
sudo chown -R 1000:1000 ./logs

Next Steps

  • Environment Variables: configure all available environment variables
  • CORS Setup: configure CORS for your frontend
  • EC2 Deployment: deploy to AWS EC2 with Docker Compose
  • Kubernetes: scale with Kubernetes
