Before a task can consume a static rate limit, you must register the key and its quota with Hatchet. Call hatchet.rate_limits.put() (Python) / hatchet.ratelimits.upsert() (TypeScript) / client.RateLimits().Upsert() (Go) at startup:
worker.py
from hatchet_sdk import Hatchet
from hatchet_sdk.rate_limit import RateLimitDuration

hatchet = Hatchet(debug=True)

RATE_LIMIT_KEY = "test-limit"

# Register: allow 2 runs per second globally
hatchet.rate_limits.put(RATE_LIMIT_KEY, 2, RateLimitDuration.SECOND)
workflow.ts
import { RateLimitDuration } from '@hatchet/protoc/v1/workflows';
import { hatchet } from './hatchet-client';

hatchet.ratelimits.upsert({
  key: 'api-service-rate-limit',
  limit: 10,
  duration: RateLimitDuration.SECOND,
});
A dynamic rate limit uses a CEL expression for the key, evaluated against the task's input at runtime. This lets you enforce a separate quota for each user, tenant, or any other dimension present in the input. When using a dynamic key, you must also supply limit so Hatchet knows the quota for keys it has never seen before.
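The per-key bucketing can be sketched in plain Python. This is an illustration of the semantics only, not the SDK or Hatchet's implementation: Hatchet evaluates the CEL expression server-side, and the names below (bucket_key, try_consume, units_used) are hypothetical.

```python
from collections import defaultdict

LIMIT = 2  # quota per bucket per window (the `limit` field)

def bucket_key(task_input: dict) -> str:
    # Stands in for evaluating the CEL expression "input.user_id"
    return str(task_input["user_id"])

units_used: dict[str, int] = defaultdict(int)

def try_consume(task_input: dict) -> bool:
    # Each unique key result gets its own independent bucket.
    key = bucket_key(task_input)
    if units_used[key] < LIMIT:
        units_used[key] += 1
        return True
    return False  # over quota: the run would wait for the window to reset

assert try_consume({"user_id": "alice"})
assert try_consume({"user_id": "alice"})
assert not try_consume({"user_id": "alice"})  # alice's bucket is exhausted
assert try_consume({"user_id": "bob"})        # bob's bucket is unaffected
```

The key point is that alice exhausting her quota has no effect on bob: each distinct value of the expression is an independent bucket.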
You can stack several RateLimit entries on a single task. Hatchet checks all of them before dispatching the run — the task waits until every limit has capacity.
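The all-or-nothing check can be sketched as follows. This is illustrative only; Bucket and try_dispatch are hypothetical names for this sketch, not Hatchet APIs.

```python
class Bucket:
    """A single rate limit bucket with a fixed quota per window."""
    def __init__(self, limit: int):
        self.limit = limit
        self.used = 0

    def has_capacity(self, units: int = 1) -> bool:
        return self.used + units <= self.limit

    def consume(self, units: int = 1) -> None:
        self.used += units

def try_dispatch(buckets: list[Bucket], units: int = 1) -> bool:
    # Every limit must have room before any units are consumed.
    if all(b.has_capacity(units) for b in buckets):
        for b in buckets:
            b.consume(units)
        return True
    return False  # the run waits; no bucket is debited

global_bucket = Bucket(limit=10)
per_user = Bucket(limit=1)

assert try_dispatch([global_bucket, per_user])       # both have capacity
assert not try_dispatch([global_bucket, per_user])   # per-user limit blocks
assert global_bucket.used == 1                       # blocked run consumed nothing
```

Note that the blocked run does not debit the global bucket: a run only consumes units when every limit it declares has capacity.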
A fixed string that identifies the rate limit bucket. All runs of this task share the same bucket. You must register the key and its quota with hatchet.rate_limits.put() before workers start. Mutually exclusive with dynamic_key.
A CEL expression evaluated against the task input at runtime. Each unique result is treated as an independent bucket. For example, "input.user_id" creates a separate limit per user. Requires limit to also be set. Mutually exclusive with static_key.
The quota ceiling for a dynamic rate limit bucket. Required when dynamic_key is set. Can be an integer or a CEL expression string (e.g. "input.limit").
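The two accepted forms of limit might be resolved like this. This is a hypothetical sketch: resolve_limit is not a Hatchet function, and real CEL evaluation is far more general than the single string match shown here.

```python
def resolve_limit(limit, task_input: dict) -> int:
    # A plain integer is used as-is.
    if isinstance(limit, int):
        return limit
    # A string is treated as a CEL expression over the task input;
    # here we only handle the literal "input.limit" as a stand-in.
    if limit == "input.limit":
        return int(task_input["limit"])
    raise ValueError(f"unsupported expression: {limit!r}")

assert resolve_limit(10, {}) == 10
assert resolve_limit("input.limit", {"limit": 5}) == 5
```

This is why limit is required alongside dynamic_key: when a run arrives with a key Hatchet has never seen, the quota for that new bucket has to come from somewhere, either a fixed integer or a value computed from the input itself.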