Overview
Twitter/X employs sophisticated anti-bot detection mechanisms. Following these best practices helps avoid account restrictions, rate limits, and IP bans.

Anti-Detection Strategies
Use Browser Cookie Extraction (Recommended)
twitter-cli automatically extracts all cookies from your browser, not just `auth_token` and `ct0`. This provides a complete browser fingerprint that makes requests indistinguishable from real browser traffic.
- Includes session cookies that validate browser state
- Preserves tracking cookies that Twitter expects
- Matches the exact cookie set from your browser session
- Reduces detection risk compared to partial auth
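To illustrate why the full cookie set matters, here is a minimal sketch of assembling a `Cookie` header from every browser cookie rather than only the two auth values. The cookie names beyond `auth_token` and `ct0` are examples, and the function is illustrative, not twitter-cli's actual extraction code:

```python
# Illustrative: serialize a full browser cookie set into one Cookie
# header, the way a real browser would send it. Cookie names other
# than auth_token/ct0 are examples only.

def build_cookie_header(cookies: dict[str, str]) -> str:
    """Join every cookie into a single Cookie header value."""
    return "; ".join(f"{name}={value}" for name, value in cookies.items())

# Partial auth (what naive clients send) vs. the full browser set.
partial = {"auth_token": "abc123", "ct0": "def456"}
full = {**partial, "guest_id": "v1%3A1700000000", "twid": "u%3D12345"}

header = build_cookie_header(full)
print(header)
```

A server comparing this header against its own Set-Cookie history sees a consistent session, whereas a two-cookie request stands out immediately.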
TLS Fingerprint Matching
twitter-cli uses `curl_cffi` to impersonate real Chrome TLS fingerprints. The client automatically:
- Matches the Chrome version installed on your system
- Generates authentic TLS handshakes
- Sends correct cipher suites and extensions
- Mimics browser HTTP/2 settings
Request Timing Jitter
The client adds randomized delays between requests to avoid pattern detection.

Transaction ID Generation

Every request includes a dynamically generated `x-client-transaction-id` header that mimics real browser behavior. This is handled automatically.
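The randomized delay behavior described under Request Timing Jitter can be sketched as follows. This is a simplified illustration, not twitter-cli's actual implementation; the 0.7–1.5 multiplier range is borrowed from the rate-limit guidance later on this page:

```python
import random
import time

def jittered_delay(base_seconds: float) -> float:
    """Scale a base delay by a random factor so request timing
    never settles into a fixed, detectable cadence."""
    return base_seconds * random.uniform(0.7, 1.5)

def paced_request(send, base_seconds: float = 2.0):
    """Sleep a jittered interval, then issue the request."""
    time.sleep(jittered_delay(base_seconds))
    return send()
```

The key point is that the multiplier is drawn fresh for every request, so intervals vary even when the base delay is constant.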
Proxy Configuration
Why Use a Proxy?
- IP Protection: Avoid exposing your real IP to Twitter’s rate limiting
- Geographic Distribution: Rotate IPs to simulate natural access patterns
- Ban Mitigation: If one IP gets rate limited, switch to another
- Residential Safety: Residential IPs are far less likely to be flagged than datacenter IPs
Setting Up a Proxy
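As a sketch of what a proxy entry might look like in `config.yaml` — the `proxy` key name and URL format are assumptions, not confirmed twitter-cli schema; check the project's own documentation for the exact fields:

```yaml
# Hypothetical config.yaml proxy entry — the key name is an
# assumption about twitter-cli's schema, not confirmed.
proxy: http://username:password@residential-proxy.example.com:8080
```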
Proxy Types Comparison
| Proxy Type | Detection Risk | Cost | Speed | Recommended |
|---|---|---|---|---|
| Residential | Very Low | High | Medium | ✅ Yes |
| Mobile | Very Low | High | Medium | ✅ Yes |
| Datacenter | High | Low | Fast | ❌ No |
| Free Public | Very High | Free | Slow | ❌ Never |
Proxy Rotation Strategy
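A rotation strategy can be as simple as cycling round-robin through a pool and retiring any proxy that gets rate limited. The sketch below is illustrative only; twitter-cli may not expose rotation directly, and the class and proxy URLs are hypothetical:

```python
import itertools

class ProxyRotator:
    """Round-robin over a proxy pool, dropping proxies that fail."""

    def __init__(self, proxies: list[str]):
        self.pool = list(proxies)
        self._cycle = itertools.cycle(self.pool)

    def next_proxy(self) -> str:
        """Return the next proxy in rotation."""
        return next(self._cycle)

    def retire(self, proxy: str) -> None:
        """Remove a rate-limited proxy and rebuild the rotation."""
        self.pool.remove(proxy)
        self._cycle = itertools.cycle(self.pool)

rotator = ProxyRotator([
    "http://res-proxy-1.example.com:8080",
    "http://res-proxy-2.example.com:8080",
])
```

Retiring on the first 429 rather than hammering a flagged IP is what keeps the remaining pool healthy.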
Rate Limit Avoidance
Keep Request Volumes Low
- Feed: 20-50 tweets per request
- Search: 30-100 tweets per request
- User posts: 20-50 tweets per request
- Frequency: Maximum 2-3 requests per minute
Configure Rate Limit Settings
- Add jitter to `requestDelay` (×0.7-1.5)
- Exponentially back off on 429 errors
- Stop at `maxCount` even if more data is available
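Combined, the three rules above can be sketched like this. `request_delay` and `max_count` mirror the `requestDelay` and `maxCount` config keys named in the list, but the pagination loop itself is a hypothetical illustration, not twitter-cli's internals:

```python
import random
import time

def fetch_with_backoff(fetch_page, request_delay=2.0, max_count=50):
    """Paginate with jittered delays, exponential backoff on 429,
    and a hard stop at max_count even if more data remains."""
    results, backoff = [], request_delay
    while len(results) < max_count:
        status, items = fetch_page()
        if status == 429:               # rate limited: wait, then double
            time.sleep(backoff)
            backoff *= 2
            continue
        backoff = request_delay         # success: reset the backoff
        results.extend(items)
        if not items:                   # no more data available
            break
        time.sleep(request_delay * random.uniform(0.7, 1.5))
    return results[:max_count]
```

Resetting the backoff after each success matters: otherwise one early 429 permanently inflates every later delay.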
Write Operation Delays
Write operations (post, like, retweet, etc.) automatically include random delays.

Avoid Frequent Startups

Each CLI startup fetches `x.com` to initialize anti-detection headers. Frequent startups can trigger rate limits.
Account Safety
Don’t Use Your Main Account
If you’re doing aggressive scraping or automation:

- Create a dedicated throwaway account
- Use it exclusively for API/CLI access
- Never post personal content from it
- If it gets banned, your main account is safe
Avoid Datacenter IPs
Twitter flags datacenter IP ranges aggressively.

Login Hygiene
- Always login from your real browser first before using twitter-cli
- Complete any security challenges (CAPTCHA, email verification) in the browser
- Wait 5-10 minutes after login before running CLI commands
- Don’t switch IPs rapidly between browser and CLI usage
Common Pitfalls
Why am I getting 401/403 errors?
Your cookies have expired or your account is temporarily restricted.

Solution:
- Re-login to x.com in your browser
- Complete any security challenges
- Wait 5 minutes
- Retry your twitter-cli command
Why do I get 429 Too Many Requests?
You’ve hit Twitter’s rate limit for your IP or account.

Solution:
- Reduce `--max` values (use 20-50 instead of 100+)
- Increase delays between requests in `config.yaml`
- Use a proxy to rotate IPs
- Wait 15-30 minutes before retrying
Why do commands fail with 404 errors?
Twitter’s GraphQL query IDs rotate periodically, causing the client’s hardcoded IDs to become invalid.

Solution:
- Retry the command; the client attempts a live queryId fallback
- Update twitter-cli to the latest version: `uv tool upgrade twitter-cli`
- If errors persist, file an issue at https://github.com/jackwener/twitter-cli/issues
Is it safe to share my cookies?

No. Anyone holding your raw cookie values can act as your account; never share them.
Can I use twitter-cli in production?
Use with caution. Twitter’s Terms of Service prohibit automated access without API keys.

Safer use cases:
- Personal research and data analysis
- One-time data exports
- Local development and testing
- Accessing your own account data

Avoid:
- High-volume scraping (1000+ tweets/hour)
- Commercial data reselling
- Automated posting at scale
- Bypassing Twitter’s official API
What happens if my account gets banned?
Twitter may:
- Temporarily lock your account (requires password reset)
- Permanently suspend for repeated violations
- Rate limit your IP (temporary, usually 15 minutes to 24 hours)
Prevention:
- Use a throwaway account for CLI access
- Never exceed 100-200 requests per hour
- Always use residential proxies for bulk operations
- Add 5-10 second delays between operations

Recovery:
- For temporary locks: reset password via email
- For IP rate limits: switch proxy or wait 24 hours
- For permanent bans: create a new account and follow best practices
Configuration Example
A complete `config.yaml` for safe, production-grade usage:
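As a sketch of what such a configuration might contain — `requestDelay` and `maxCount` are the keys referenced earlier on this page, while the `proxy` key and the value formats are assumptions about twitter-cli's schema:

```yaml
# Hypothetical config.yaml — requestDelay and maxCount appear earlier
# in these docs; the proxy key is an assumed field name.
proxy: http://username:password@residential-proxy.example.com:8080
requestDelay: 3      # base seconds between requests (jittered x0.7-1.5)
maxCount: 50         # hard cap on items fetched per command
```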
Summary Checklist
- ✅ Use browser cookie extraction (not manual env vars)
- ✅ Configure a residential or mobile proxy
- ✅ Keep `--max` values under 50 for most operations
- ✅ Add 2-5 second delays between requests
- ✅ Use a throwaway account for heavy automation
- ✅ Avoid datacenter IPs (AWS, GCP, Azure, etc.)
- ✅ Wait 5-10 minutes after browser login before CLI usage
- ✅ Monitor for 429 errors and slow down immediately
- ✅ Update twitter-cli regularly for latest anti-detection
- ❌ Never share raw cookie values
- ❌ Never scrape 500+ tweets in a single session
- ❌ Never run CLI commands every few seconds in a loop
