Overview
Hazel Chat uses Effect Cluster and Effect Workflow for distributed background job processing. This provides:
- Durable execution - workflows survive restarts
- Idempotency - safe to retry operations
- Scalability - distributed across multiple nodes
- Observability - structured logging and tracing
- Type safety - fully typed with Effect-TS
Effect Workflow is like Temporal or AWS Step Functions, but built on Effect-TS with full type safety and composability.
Architecture
Cluster Service Setup
Server Configuration
- ClusterWorkflowEngine - Manages workflow execution
- BunClusterSocket - Shard coordination via WebSocket
- PgClient - PostgreSQL for message persistence
- WorkflowProxyServer - HTTP API for triggering workflows
The cluster service runs on port 3020 by default.
Workflow Definitions
Workflows are defined in packages/domain/src/cluster/workflows/ and shared between the backend and the cluster service.
Defining a Workflow
- name - Unique workflow identifier
- payload - Input schema (validated automatically)
- error - Error schema for failures
- idempotencyKey - Function to extract the idempotency key
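The four fields above can be pictured with plain-TypeScript stand-ins. This is a sketch only: every name below is hypothetical, and the types are simplified stand-ins rather than the actual @effect/workflow API.

```typescript
// Sketch: the shape of a workflow definition, mirroring the fields listed
// above. All identifiers here are illustrative, not the real Effect types.
type WorkflowDef<Payload> = {
  name: string                                  // unique workflow identifier
  payload: (input: unknown) => Payload          // input schema: validate + decode
  error: (input: unknown) => Error              // error schema for failures
  idempotencyKey: (payload: Payload) => string  // derives the dedup key
}

type NotifyPayload = { messageId: string; channelId: string }

const sendMessageNotifications: WorkflowDef<NotifyPayload> = {
  name: "SendMessageNotifications",
  payload: (input) => {
    const o = input as Partial<NotifyPayload>
    if (typeof o?.messageId !== "string" || typeof o?.channelId !== "string") {
      throw new Error("invalid payload")
    }
    return { messageId: o.messageId, channelId: o.channelId }
  },
  error: (input) => new Error(String(input)),
  idempotencyKey: ({ messageId }) => `notify:${messageId}`,
}

// Triggering twice for the same message derives the same key,
// so the engine can deduplicate the execution.
const key = sendMessageNotifications.idempotencyKey(
  sendMessageNotifications.payload({ messageId: "m1", channelId: "c1" })
)
```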
Activity Schemas
Activities represent individual steps in a workflow:
Workflow Implementation
Implementing the Workflow
- Use Activity.make() for each step
- Always provide success and error schemas
- Activities can access services via yield*
- Errors are automatically retried based on the retryable flag
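The retry behaviour driven by the retryable flag can be sketched like this. It is a simplified, synchronous stand-in with hypothetical names; in the real system the engine derives retryability from the activity's error schema.

```typescript
// Sketch: an activity runner that retries only errors flagged as retryable.
class ActivityError extends Error {
  constructor(message: string, readonly retryable: boolean) {
    super(message)
  }
}

function runActivity<A>(execute: () => A, maxAttempts = 3): A {
  for (let attempt = 1; ; attempt++) {
    try {
      return execute()
    } catch (err) {
      const retryable = err instanceof ActivityError && err.retryable
      if (!retryable || attempt >= maxAttempts) throw err // give up
      // retryable and attempts remain: loop and try again
    }
  }
}

// A flaky step: fails twice with a retryable error, then succeeds.
let calls = 0
const result = runActivity(() => {
  calls++
  if (calls < 3) throw new ActivityError("transient db error", true)
  return "ok"
})
```

A non-retryable error (for example, a validation failure) is rethrown immediately on the first attempt instead of being retried.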
Registering Workflows
Durability
Workflow state is persisted in PostgreSQL - survives restarts
Idempotency
Workflows can be safely retried using idempotency keys
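The effect of idempotency keys can be sketched with a tiny in-memory engine (hypothetical; the real store is PostgreSQL-backed): a completed execution's result is recorded under its key, so retrying with the same key returns the stored result instead of running again.

```typescript
// Sketch: results are stored by execution ID, so a retry with the same ID
// returns the recorded result instead of re-running the workflow.
const completed = new Map<string, string>()
let executions = 0

function trigger(executionId: string, run: () => string): string {
  const prior = completed.get(executionId)
  if (prior !== undefined) return prior // already ran: return stored result
  executions++
  const result = run()
  completed.set(executionId, result)
  return result
}

const first = trigger("send-welcome-42", () => "sent")
const second = trigger("send-welcome-42", () => "sent") // deduplicated
```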
Triggering Workflows
From Backend Code
Via HTTP API
Workflows can also be triggered via HTTP:
The id field is the execution ID used for idempotency. Using the same ID will not create duplicate executions.
Example Workflows
Message Notification Workflow
Purpose: Creates notifications for new messages based on channel type and mentions.
Trigger: After creating a message
Activities:
- GetChannelMembers - Query eligible members
- CreateNotifications - Batch insert notifications
- DM/group chats → notify all members
- Regular channels → notify only mentioned users or reply-to author
- Respect mute settings and presence status
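The routing rules above can be sketched as a pure function. Types and names are hypothetical, and this sketch only covers mute settings (the real activity also checks presence status).

```typescript
// Sketch of the targeting rules: DMs and group chats notify everyone,
// regular channels notify only mentioned users and the reply-to author.
// Muted members and the message author are always excluded.
type Member = { userId: string; muted: boolean }
type Msg = { authorId: string; mentions: string[]; replyToAuthorId?: string }

function notifyTargets(
  channelType: "dm" | "group" | "channel",
  members: Member[],
  msg: Msg,
): string[] {
  const eligible = members.filter((m) => !m.muted && m.userId !== msg.authorId)
  if (channelType === "dm" || channelType === "group") {
    return eligible.map((m) => m.userId) // notify all members
  }
  // regular channel: mentioned users or the reply-to author only
  const wanted = new Set(msg.mentions)
  if (msg.replyToAuthorId) wanted.add(msg.replyToAuthorId)
  return eligible.filter((m) => wanted.has(m.userId)).map((m) => m.userId)
}

const members = [
  { userId: "a", muted: false },
  { userId: "b", muted: true },
  { userId: "c", muted: false },
]
const dmTargets = notifyTargets("dm", members, { authorId: "a", mentions: [] })
const chTargets = notifyTargets("channel", members, {
  authorId: "a",
  mentions: ["c"],
})
```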
Cleanup Uploads Workflow
Purpose: Removes unused file uploads from S3.
Trigger: Scheduled cron job (daily)
Activities:
- FindOrphanedUploads - Query uploads not linked to messages
- DeleteFromS3 - Remove files from object storage
- DeleteUploadRecords - Clean up database records
GitHub Webhook Workflow
Purpose: Processes GitHub webhook events (issues, PRs, etc.).
Trigger: Incoming webhook from GitHub
Activities:
- ParseWebhook - Extract event data
- FindSubscriptions - Find channels subscribed to this repo
- PostMessages - Create messages in subscribed channels
RSS Feed Poll Workflow
Purpose: Polls RSS feeds and posts new items to channels.
Trigger: Scheduled cron job (every 5 minutes)
Activities:
- FetchFeed - Download and parse the RSS feed
- FindNewItems - Compare with the last poll
- PostFeedItems - Create messages for new items
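The FindNewItems step amounts to a set difference against the GUIDs seen in the previous poll. A sketch with hypothetical types:

```typescript
// Sketch: keep only items whose GUID was not seen in the last poll.
type FeedItem = { guid: string; title: string }

function findNewItems(fetched: FeedItem[], seenGuids: Set<string>): FeedItem[] {
  return fetched.filter((item) => !seenGuids.has(item.guid))
}

const seen = new Set(["g1", "g2"])
const fetched = [
  { guid: "g1", title: "old post" },
  { guid: "g3", title: "new post" },
]
const fresh = findNewItems(fetched, seen) // only g3 survives
```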
Workflow Client
Client Setup
Using the Client
Cron Jobs
Scheduled workflows using Effect Cluster’s cron system:
Error Handling & Retries
Retryable Errors
Mark errors as retryable:
Manual Retry Logic
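A manual retry loop with exponential backoff might look like the following. This is a plain async sketch with hypothetical names; Effect code would express the same policy declaratively with Schedule combinators.

```typescript
// Sketch: retry a task with exponential backoff, capped at maxAttempts.
async function withRetry<A>(
  task: () => Promise<A>,
  maxAttempts = 5,
  baseDelayMs = 100,
): Promise<A> {
  let lastError: unknown
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await task()
    } catch (err) {
      lastError = err
      const delay = baseDelayMs * 2 ** attempt // 100, 200, 400, ...
      await new Promise((resolve) => setTimeout(resolve, delay))
    }
  }
  throw lastError // exhausted all attempts
}
```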
Testing Workflows
Mock Workflow Client
Best Practices
Idempotency Keys
Always provide idempotency keys to prevent duplicate executions
Activity Schemas
Define schemas for activity inputs and outputs
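What a schema buys you can be illustrated with a plain decoder function (a stand-in with hypothetical names, not the effect Schema module): untyped input is validated once at the activity boundary, so the activity body only ever sees well-formed data.

```typescript
// Sketch: a decoder validates unknown input before the activity runs.
type UploadRef = { uploadId: string; sizeBytes: number }

function decodeUploadRef(input: unknown): UploadRef {
  const o = input as Record<string, unknown>
  if (typeof o?.uploadId !== "string" || typeof o?.sizeBytes !== "number") {
    throw new Error("invalid UploadRef")
  }
  return { uploadId: o.uploadId, sizeBytes: o.sizeBytes }
}

const ok = decodeUploadRef({ uploadId: "u1", sizeBytes: 1024 })

let rejected = false
try {
  decodeUploadRef({ uploadId: 42 }) // wrong types: rejected at the boundary
} catch {
  rejected = true
}
```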
Error Types
Mark errors as retryable or non-retryable appropriately
Fire and Forget
Use Effect.fork to trigger workflows without blocking
Logging
Log workflow progress for observability
Database Access
Access database in activities, not in workflow logic
Monitoring
Workflow Status
Check workflow execution status:
Logs
Workflow logs are structured and include:
- Execution ID
- Activity name
- Duration
- Error details (if failed)
Use OpenTelemetry integration for distributed tracing across workflows and activities.
Next Steps
Effect-TS Patterns
Learn more about Effect-TS patterns
RPC System
Understand the RPC system for triggering workflows