## Rate Limit Basics

GitHub enforces different rate limits depending on how you authenticate:

| Authentication Method | Requests per Hour | Requests per Minute |
|---|---|---|
| Unauthenticated | 60 | ~1 |
| Authenticated (with token) | 5,000 | ~83 |
| GitHub Apps | 5,000 (per installation) | ~83 |
## How GitHub Wrapped Uses the API
Generating a wrapped for a repository requires multiple API calls.

### Typical API Call Breakdown

For a medium-sized repository (one year of data):

- Repository info: 1 request
- Contributors: 1-10 requests (paginated, 100 per page)
- Commits: 1-10 requests (paginated, 100 per page)
- Languages: 1 request
- Issues: 1-5 requests (paginated)
- Pull requests: 1-5 requests (paginated)
- Stargazers: 1-5 requests (paginated)
- Releases: 1 request
Large repositories with thousands of commits or contributors may require significantly more API calls.
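As a rough sanity check, the breakdown above can be turned into a call-count estimate. This is pure illustration (the `pagesNeeded` helper and the example counts are not from the codebase); the 100-per-page figure matches the pagination noted above.

```typescript
// Estimate API calls for one wrapped generation, per the breakdown above.
// Paginated endpoints return up to 100 items per page.
function pagesNeeded(itemCount: number, perPage: number = 100): number {
  return Math.max(1, Math.ceil(itemCount / perPage));
}

// Hypothetical repo: 250 contributors, 800 commits, 120 issues, 90 PRs, 300 stars.
const estimate =
  1 +                 // repository info
  pagesNeeded(250) +  // contributors: 3 pages
  pagesNeeded(800) +  // commits: 8 pages
  1 +                 // languages
  pagesNeeded(120) +  // issues: 2 pages
  pagesNeeded(90) +   // pull requests: 1 page
  pagesNeeded(300) +  // stargazers: 3 pages
  1;                  // releases
// estimate === 20
```

Twenty requests is well inside even the unauthenticated 60/hour budget for a single run, but three unauthenticated runs exhaust it.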
## Rate Limit Handling in the Code

GitHub Wrapped includes intelligent rate limit handling to prevent errors and provide a better user experience.
### Rate Limit Checking

The application checks available rate limits before making API calls:
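A minimal sketch of this kind of pre-flight check (the names and shapes below are illustrative assumptions, not the app's actual implementation in `lib/github.ts`):

```typescript
// Illustrative pre-flight quota check; not the app's actual code.
interface RateLimitStatus {
  remaining: number; // requests left in the current window
  reset: number;     // Unix seconds when the window resets
}

// Parse GitHub's rate limit headers (keys lowercased, as fetch exposes them).
function parseRateLimit(headers: Record<string, string>): RateLimitStatus {
  return {
    remaining: Number(headers["x-ratelimit-remaining"] ?? "0"),
    reset: Number(headers["x-ratelimit-reset"] ?? "0"),
  };
}

// Only start a wrapped generation if there is headroom for all planned calls.
function hasQuotaFor(status: RateLimitStatus, plannedCalls: number): boolean {
  return status.remaining >= plannedCalls;
}
```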
### Error Handling for Rate Limits

When rate limits are exceeded, the application provides clear error messages:
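One way such a message can be built from the reset timestamp (a hypothetical helper; the app's exact wording may differ):

```typescript
// Hypothetical helper: turn the X-RateLimit-Reset timestamp into a
// user-facing message. Not the app's exact implementation.
function rateLimitMessage(resetUnixSeconds: number, nowMs: number): string {
  const waitMinutes = Math.max(1, Math.ceil((resetUnixSeconds * 1000 - nowMs) / 60_000));
  return `GitHub API rate limit exceeded. Try again in about ${waitMinutes} minute(s), or add a GITHUB_TOKEN to raise the limit.`;
}
```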
### Development Mode Exemption

In development mode (`NODE_ENV=development`), rate limit checks are relaxed:
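A sketch of what such a relaxation looks like (function name and signature are illustrative assumptions):

```typescript
// Illustrative dev-mode bypass; the real check lives in the app's code.
function shouldBlockRequest(
  nodeEnv: string | undefined,
  remaining: number,
  plannedCalls: number
): boolean {
  if (nodeEnv === "development") return false; // relaxed in development
  return remaining < plannedCalls;             // strict everywhere else
}
```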
## Avoiding Rate Limits
### 1. Add a GitHub Token
The single most effective way to avoid rate limits is to authenticate with a GitHub token.

**Generate a personal access token**

Visit GitHub Settings > Tokens and create a new token. For public repositories, no scopes are needed. For private repos, select the `repo` scope.
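Once generated, the token goes in `.env.local` (the `GITHUB_TOKEN` variable name matches the one referenced in the scenarios below):

```bash
# .env.local
GITHUB_TOKEN=ghp_your_token_here
```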
### 2. Leverage the Built-in Cache

GitHub Wrapped includes a 24-hour cache that dramatically reduces API calls for popular repositories:

- First request: Makes API calls and caches the result
- Subsequent requests (within 24 hours): Returns cached data with zero API calls
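A minimal sketch of a 24-hour TTL cache of this kind (illustrative only; the app's real cache is more involved and can also be backed by Redis):

```typescript
// Minimal 24-hour TTL cache sketch; not the app's actual cache.
const TTL_MS = 24 * 60 * 60 * 1000;

interface Entry<T> {
  value: T;
  expiresAt: number; // ms timestamp after which the entry is stale
}

class TtlCache<T> {
  private store = new Map<string, Entry<T>>();

  get(key: string, nowMs: number = Date.now()): T | undefined {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt <= nowMs) return undefined; // miss or stale
    return entry.value;
  }

  set(key: string, value: T, nowMs: number = Date.now()): void {
    this.store.set(key, { value, expiresAt: nowMs + TTL_MS });
  }
}
```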
### 3. Use Redis for Distributed Caching
For production deployments with multiple serverless instances, configure Redis:
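Configuration typically happens via an environment variable along these lines (the exact variable name is an assumption; check the app's own configuration docs):

```bash
# .env.local - variable name is illustrative, verify against the app's config
REDIS_URL=redis://default:<password>@<host>:6379
```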
### 4. Implement API Call Pagination Limits

The codebase includes smart pagination limits to prevent excessive API calls:

- Contributors: 10 pages (1,000 contributors max)
- Commits: 10 pages (1,000 commits max)
- Issues/PRs: 5 pages (500 items max)
- Stargazers: 5 pages (500 stargazers max)
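Capped pagination of this kind can be sketched as follows (illustrative; `fetchCapped` is not the codebase's actual helper, but it mirrors the limits listed above):

```typescript
// Sketch of capped pagination: fetch at most maxPages pages of up to
// 100 items each, stopping early when a short page signals the end.
async function fetchCapped<T>(
  fetchPage: (page: number) => Promise<T[]>, // returns one page of results
  maxPages: number
): Promise<T[]> {
  const items: T[] = [];
  for (let page = 1; page <= maxPages; page++) {
    const batch = await fetchPage(page);
    items.push(...batch);
    if (batch.length < 100) break; // short page means we reached the end
  }
  return items;
}
```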
## Monitoring Rate Limits
### Check Current Rate Limit Status
You can check your current rate limit status programmatically:
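For example, by querying GitHub's `/rate_limit` endpoint, which does not itself count against your quota (the helper names below are illustrative):

```typescript
// Query GitHub's /rate_limit endpoint and summarize the core quota.
interface CoreRate {
  limit: number;
  remaining: number;
  reset: number; // Unix seconds
}

function summarize(core: CoreRate): string {
  const resetAt = new Date(core.reset * 1000).toISOString();
  return `${core.remaining}/${core.limit} requests left (resets at ${resetAt})`;
}

async function checkRateLimit(token?: string): Promise<string> {
  const res = await fetch("https://api.github.com/rate_limit", {
    headers: token ? { Authorization: `Bearer ${token}` } : {},
  });
  const body = await res.json();
  return summarize(body.resources.core as CoreRate);
}
```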
### Using GitHub’s API Directly

Check your rate limit status via curl:
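For example:

```bash
# Unauthenticated (shows the 60/hour quota for your IP)
curl -s https://api.github.com/rate_limit

# Authenticated (shows the 5,000/hour quota for your token)
curl -s -H "Authorization: Bearer $GITHUB_TOKEN" https://api.github.com/rate_limit
```

Querying `/rate_limit` does not count against your quota.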
### What Happens When You Hit the Limit?

When rate limits are exceeded:

- **Error message**: Users see a clear error message
- **Reset header**: GitHub includes an `X-RateLimit-Reset` header indicating when limits reset
- **Automatic retry**: Some endpoints in the code handle errors gracefully
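Graceful handling generally means waiting until the reset time before retrying. A hypothetical helper for computing that delay (not the codebase's actual retry logic):

```typescript
// Hypothetical helper: compute how long to wait, given the
// X-RateLimit-Reset header value (Unix seconds), before retrying.
function retryDelayMs(resetHeader: string | null, nowMs: number): number {
  const resetMs = Number(resetHeader ?? "0") * 1000;
  return Math.max(0, resetMs - nowMs); // never negative
}
```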
## Rate Limit Best Practices
### For Development
- Use `NODE_ENV=development` to bypass strict checks
- Add a personal access token to avoid interruptions
- Test with smaller repositories
### For Production
- Always use a GitHub token
- Configure Redis for distributed caching
- Monitor rate limit usage
- Consider implementing request queuing for high traffic
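Request queuing can be as simple as serializing GitHub calls so a traffic burst doesn't fire them all at once. A minimal sketch (illustrative, not a feature of the codebase):

```typescript
// Minimal serial queue: tasks run one at a time, in submission order,
// so bursts of wrapped generations don't hit the API concurrently.
class SerialQueue {
  private tail: Promise<unknown> = Promise.resolve();

  run<T>(task: () => Promise<T>): Promise<T> {
    const next = this.tail.then(task, task);  // run after the previous task settles
    this.tail = next.catch(() => undefined);  // keep the chain alive on errors
    return next;
  }
}
```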
## Optimization Tips
- Enable caching: Always configure caching (in-memory or Redis) to reduce API calls
- Use conditional requests: GitHub supports ETags for conditional requests (future enhancement)
- Batch requests: The app already uses pagination efficiently
- Monitor usage: Track API usage patterns to identify optimization opportunities
## Common Rate Limit Scenarios
### Scenario 1: Development testing
**Problem**: Repeatedly testing the same repository exhausts unauthenticated limits

**Solution**:
- Add a `GITHUB_TOKEN` to your `.env.local`
- Development mode already relaxes rate limit checks
- Use the cache - subsequent requests use cached data
### Scenario 2: High-traffic production site
**Problem**: Multiple users generating wrappeds simultaneously

**Solution**:
- Configure `GITHUB_TOKEN` for authenticated requests (5,000/hour)
- Set up Redis caching to share cache across instances
- Popular repositories will be served from cache (0 API calls)
### Scenario 3: Very large repository
**Problem**: Repository has 10,000+ commits and contributors

**Solution**:
- The app limits pagination to prevent excessive calls
- Enable caching - large repos benefit most from cache
- Consider implementing request queuing or background jobs
### Scenario 4: Shared network/IP

**Problem**: Unauthenticated limits are counted per IP address, so everyone behind a shared network or proxy draws from the same 60 requests/hour

**Solution**:

- Add a `GITHUB_TOKEN` - authenticated limits are tracked per token, not per IP
## Rate Limit Headers
GitHub includes rate limit information in response headers:

| Header | Description |
|---|---|
| `X-RateLimit-Limit` | Maximum requests per hour |
| `X-RateLimit-Remaining` | Remaining requests in current window |
| `X-RateLimit-Reset` | Unix timestamp when limit resets |
| `X-RateLimit-Used` | Requests used in current window |
## Next Steps
- Configure a GitHub token
- Learn about the caching system
- Complete the setup guide