The bot ships as a Docker image built from a multi-stage Dockerfile. The final image is a minimal Alpine-based Node.js container that runs the compiled bot as a single CJS bundle.

The Dockerfile

The build uses three stages: a shared base layer, a build stage that compiles the bot, and a lean final image that only contains the production output.
Dockerfile
FROM node:24-alpine AS base
ENV NODE_ENV=production
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN corepack enable
RUN apk add --no-cache ripgrep git

FROM base AS build
COPY . /usr/src/app
WORKDIR /usr/src/app
RUN --mount=type=cache,id=/pnpm/store,target=/pnpm/store pnpm install --frozen-lockfile
RUN pnpm run --filter=discord-bot build
RUN pnpm deploy --filter=discord-bot --prod /prod/discord-bot

FROM base
COPY --from=build /prod/discord-bot /prod/discord-bot
WORKDIR /prod/discord-bot
EXPOSE 8080
CMD [ "node", "dist/main.cjs" ]

Stage breakdown

base — node:24-alpine with two system packages added:
  • ripgrep — required by the EffectRepo service for fast code search across the cloned Effect source tree.
  • git — required to clone the Effect repository at runtime.
Without these packages, features that depend on local repository access will fail to start.
build — copies the full monorepo, installs all dependencies with a pnpm cache mount for faster rebuilds, then:
  1. Runs tsup via the discord-bot build script to produce dist/main.cjs.
  2. Runs pnpm deploy --prod to produce a self-contained deployment directory at /prod/discord-bot containing only production dependencies.
Final image — copies only /prod/discord-bot from the build stage. This keeps the final image small by discarding dev dependencies, TypeScript sources, and the rest of the monorepo.
Port 8080 is exposed for health check probes (used by Fly.io). The bot itself is a pure Discord Gateway client and does not serve HTTP traffic.
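To sanity-check that the final stage really contains only the production output, the built image can be inspected directly. A sketch, assuming the image has been built and tagged effect-discord-bot:

```shell
# Show the final image size; because dev dependencies, TypeScript sources,
# and the rest of the monorepo are discarded, it should be far smaller
# than the intermediate build stage.
docker image ls effect-discord-bot

# Override the default command to list the deploy directory and confirm
# the compiled bundle (dist/main.cjs) is present.
docker run --rm effect-discord-bot ls dist
```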

Building the image

Run this from the repository root:
docker build -t effect-discord-bot .
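Fly's shared-cpu machines are x86_64, so when building on an arm64 host (e.g. Apple Silicon) the target platform can be pinned explicitly. A sketch, not needed for plain local runs:

```shell
# Cross-build for amd64. BuildKit (the default builder in current Docker
# releases) is required either way, since the Dockerfile uses a
# --mount=type=cache for the pnpm store.
docker build --platform linux/amd64 -t effect-discord-bot .
```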
The repository also contains a runner.Dockerfile for a separate runner package (not the Discord bot). This file is unrelated to deploying the bot and can be ignored.

Running the container

Pass required environment variables with -e flags or a .env file:
docker run \
  -e DISCORD_BOT_TOKEN=your_token_here \
  -e OPENAI_API_KEY=your_openai_key_here \
  effect-discord-bot
With optional variables:
docker run \
  -e DISCORD_BOT_TOKEN=your_token_here \
  -e OPENAI_API_KEY=your_openai_key_here \
  -e GITHUB_TOKEN=your_github_token \
  -e DEBUG=true \
  effect-discord-bot
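The same variables can also be supplied from a file via --env-file, which keeps tokens out of shell history. A sketch, assuming a local .env file with one KEY=value pair per line:

```shell
# .env (not committed) contains lines such as:
#   DISCORD_BOT_TOKEN=your_token_here
#   OPENAI_API_KEY=your_openai_key_here
docker run --env-file .env effect-discord-bot
```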
See Environment Variables for the full list of supported variables.

Local observability with Docker Compose

The repository includes a docker-compose.yaml that spins up a local Grafana OTEL stack for development tracing:
docker-compose.yaml
services:
  grafana:
    image: grafana/otel-lgtm:0.11.10
    container_name: grafana
    restart: unless-stopped
    ports:
      - "4000:3000"  # Grafana UI
      - "4318:4318"  # OTLP HTTP receiver
Start it with:
docker compose up -d
The bot will automatically send traces to http://localhost:4318 when HONEYCOMB_API_KEY is not set. Open http://localhost:4000 to view traces in Grafana.
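Before starting the bot, it can help to confirm the stack is actually up. A sketch using Grafana's standard /api/health route and a plain TCP check on the OTLP port:

```shell
# Hit Grafana's health endpoint on the remapped UI port
# (4000 on the host -> 3000 in the container).
curl -sf http://localhost:4000/api/health

# The OTLP HTTP receiver listens on 4318; a TCP connect check is enough
# to confirm the port mapping without sending a real trace payload.
nc -z localhost 4318 && echo "OTLP receiver reachable"
```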

Deploying to Fly.io

The repository includes a fly.toml that targets the iad (us-east) region on a shared-cpu-1x VM with 512 MB of memory:
fly.toml
app = 'effectful-discord-bot'
primary_region = 'iad'

[build]
  dockerfile = 'Dockerfile'

[[vm]]
  size = 'shared-cpu-1x'
  memory = '512mb'
1. Install the Fly CLI

curl -L https://fly.io/install.sh | sh

2. Authenticate

fly auth login

3. Set secrets

Set all required environment variables as Fly secrets so they are never stored in plaintext:
fly secrets set \
  DISCORD_BOT_TOKEN=your_token_here \
  OPENAI_API_KEY=your_openai_key_here

4. Deploy

fly deploy
Fly reads fly.toml and builds the image using the project Dockerfile.

5. Check status

fly status
fly logs
Set HONEYCOMB_API_KEY as a Fly secret to enable production tracing to Honeycomb. Without it, traces are sent to localhost:4318, which is a no-op in production.
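Enabling Honeycomb uses the same secrets flow as above. A sketch, with a placeholder key value:

```shell
# Fly restarts the app's machines when a secret changes, so the new
# value takes effect without a separate deploy.
fly secrets set HONEYCOMB_API_KEY=your_honeycomb_key_here
```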
