# Rate Limiting Middleware
The `rateLimit` middleware prevents API abuse by limiting the number of requests a client can make within a specific time window. It uses an in-memory rate limiter that tracks requests by IP address and returns rate limit headers with each response.
## Installation
Import the middleware from `@middlewares/rate-limit` and wrap your API handler.
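A minimal wiring sketch, assuming the import path above and an Astro-style route module (the handler name and body are illustrative):

```typescript
import { rateLimit } from "@middlewares/rate-limit";
import type { APIContext } from "astro";

// A plain route handler, before rate limiting is applied.
async function handler(context: APIContext): Promise<Response> {
  return new Response("ok");
}

// Export the wrapped handler as the route's GET endpoint.
export const GET = rateLimit(handler);
```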
## TypeScript Signature
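A sketch of the signature implied by this section; the optional `options` parameter and its field names are assumptions (mirroring `rate-limiter-flexible`'s vocabulary), not confirmed by this page:

```typescript
import type { APIContext } from "astro";

// Handler shape: takes an APIContext, resolves to a Response.
type APIRoute = (context: APIContext) => Promise<Response>;

// Assumed option names for per-endpoint customization.
interface RateLimitOptions {
  points?: number;   // max requests per window
  duration?: number; // window length in seconds
}

declare function rateLimit(handler: APIRoute, options?: RateLimitOptions): APIRoute;
```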
The wrapped handler is the API route handler function to be rate-limited; it receives an `APIContext` and returns a `Promise<Response>`.

## Configuration
### Default Configuration
If no options are provided, the middleware uses a default rate limiter:

- Points: 1000 requests
- Duration: 60 seconds (1 minute)
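Expressed with the `rate-limiter-flexible` options this page names further down, the default corresponds roughly to:

```typescript
import { RateLimiterMemory } from "rate-limiter-flexible";

// Defaults from this page: 1000 requests per 60-second window.
const defaultLimiter = new RateLimiterMemory({
  points: 1000,  // requests allowed per window
  duration: 60,  // window length in seconds
});
```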
### Custom Configuration
You can customize the rate limits per endpoint.

## Example Usage
### Basic Rate Limiting
`src/pages/api/animes/random.ts`
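A sketch of what this route file might look like; the handler body is illustrative:

```typescript
import { rateLimit } from "@middlewares/rate-limit";
import type { APIContext } from "astro";

async function handler(context: APIContext): Promise<Response> {
  // ...pick a random anime and serialize it (illustrative placeholder)
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
}

// With no options, the default limiter (1000 requests per minute) applies.
export const GET = rateLimit(handler);
```

To customize limits per endpoint as described above, an options object could be passed instead, e.g. `rateLimit(handler, { points: 100, duration: 60 })`; the exact option names are assumptions.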
### Combined with Other Middlewares
Rate limiting can be combined with other middlewares, such as the Redis connection middleware:

`src/pages/api/animes/full.ts`
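A sketch of the combined wiring; `withRedis` and its import path are hypothetical names for the Redis connection middleware mentioned above:

```typescript
import { rateLimit } from "@middlewares/rate-limit";
import { withRedis } from "@middlewares/redis"; // hypothetical path
import type { APIContext } from "astro";

async function handler(context: APIContext): Promise<Response> {
  // ...query the full dataset (illustrative placeholder)
  return new Response("ok");
}

// rateLimit is the outermost wrapper, so requests are throttled
// before the Redis connection is touched at all.
export const GET = rateLimit(withRedis(handler));
```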
When combining middlewares, apply `rateLimit` as the outermost wrapper to ensure rate limiting happens before any other processing.

## Response Headers
The middleware automatically adds rate limit headers to successful responses:

| Header | Description | Example |
|---|---|---|
| `X-RateLimit-Limit` | Maximum number of requests allowed | 100 |
| `X-RateLimit-Remaining` | Number of requests remaining in the current window | 87 |
| `X-RateLimit-Reset` | Seconds until the rate limit resets | 45 |
### Example Response
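An illustrative successful response, using the example values from the table above:

```http
HTTP/1.1 200 OK
Content-Type: application/json
X-RateLimit-Limit: 100
X-RateLimit-Remaining: 87
X-RateLimit-Reset: 45
```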
## Error Handling
### Rate Limit Exceeded (429)
When a client exceeds their rate limit, the middleware returns:

- An error message indicating too many requests.
- The number of seconds the client should wait before making another request.
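An illustrative 429 body carrying the two fields described above; the field names themselves are assumptions, since this page only describes their meaning:

```json
{
  "error": "Too many requests",
  "retryAfter": 45
}
```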
### Internal Server Error (500)
If the rate limiter itself encounters an unexpected error, the middleware responds with a generic 500 Internal Server Error.

## How It Works
1. IP Tracking: The middleware extracts the client's IP address from `context.clientAddress`.
2. Point Consumption: Each request consumes one point from the client's quota.
3. Header Injection: Rate limit headers are added to the response.
4. Limit Enforcement: When points are exhausted, the middleware returns 429 with retry information.
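The four steps can be sketched end to end with a hand-rolled in-memory counter, a simplified stand-in for the real limiter; every name here is illustrative:

```typescript
type APIContext = { clientAddress: string }; // minimal stand-in for Astro's APIContext
type Handler = (ctx: APIContext) => Promise<Response>;

function rateLimitSketch(handler: Handler, points = 1000, duration = 60): Handler {
  const hits = new Map<string, { count: number; resetAt: number }>();

  return async (ctx) => {
    const now = Date.now();
    // 1. IP tracking: key the counter by the client's address.
    const ip = ctx.clientAddress;
    let entry = hits.get(ip);
    if (!entry || entry.resetAt <= now) {
      // 4b. Reset: start a fresh window once the previous one expires.
      entry = { count: 0, resetAt: now + duration * 1000 };
      hits.set(ip, entry);
    }
    // 2. Point consumption: each request costs one point.
    entry.count += 1;
    const remaining = points - entry.count;
    const reset = Math.ceil((entry.resetAt - now) / 1000);
    // 4. Limit enforcement: reject with 429 once points are exhausted.
    if (remaining < 0) {
      return new Response(JSON.stringify({ error: "Too many requests" }), {
        status: 429,
        headers: { "Retry-After": String(reset) },
      });
    }
    // 3. Header injection: annotate the successful response.
    const res = await handler(ctx);
    res.headers.set("X-RateLimit-Limit", String(points));
    res.headers.set("X-RateLimit-Remaining", String(remaining));
    res.headers.set("X-RateLimit-Reset", String(reset));
    return res;
  };
}
```

Unlike this sketch, the real middleware delegates counting and expiry to `rate-limiter-flexible`, but the control flow is the same.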
### Rate Limiter Strategy
The middleware uses `rate-limiter-flexible` with an in-memory storage strategy:
- Algorithm: Token bucket
- Granularity: Per IP address
- Storage: In-memory (RateLimiterMemory)
- Reset: Automatic after duration window expires
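In terms of the `rate-limiter-flexible` API, this strategy boils down to something like the following sketch (the helper name is illustrative; option values are this page's defaults):

```typescript
import { RateLimiterMemory } from "rate-limiter-flexible";

const limiter = new RateLimiterMemory({
  points: 1000, // requests per window
  duration: 60, // window length in seconds; counters reset automatically
});

async function consumeFor(ip: string) {
  try {
    // One point per request, keyed by IP address.
    const res = await limiter.consume(ip);
    return { allowed: true, remaining: res.remainingPoints };
  } catch (rej) {
    // The rejection value carries msBeforeNext: time until the window resets.
    return { allowed: false, retryAfterMs: (rej as { msBeforeNext: number }).msBeforeNext };
  }
}
```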
Since the rate limiter uses in-memory storage, limits are per instance. In a distributed environment, consider using `RateLimiterRedis` for shared rate limiting across multiple servers.

## Best Practices
### Choose Appropriate Limits
### Informative Client Handling
Always check rate limit headers on the client side.

## Security Considerations
Reverse Proxy: If running behind a reverse proxy (nginx, Cloudflare), ensure the real client IP is properly forwarded through headers like `X-Forwarded-For`; otherwise all requests may appear to come from the proxy's address and share a single quota.

## Related
- Authentication Middleware - Protect endpoints with session validation
- API Endpoints - View all available API endpoints
