Overview
twitter-cli supports exporting all data to JSON for scripting, data analysis, and offline workflows. You can also load previously exported JSON files back into the CLI for filtering and display.
Exporting to JSON
Using the --json Flag
Add --json to any read command to output structured JSON instead of the default terminal format:
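For example (the subcommand names here are illustrative; --json applies to whichever read commands your build provides):

```shell
# Print the timeline as structured JSON instead of the formatted view
twitter-cli timeline --json

# Works with any read command, e.g. search
twitter-cli search "rust" --json
```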
Saving to File with --output / -o
Use the --output (or -o) flag to save JSON directly to a file:
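For example (subcommand names illustrative):

```shell
# Write the JSON export straight to a file
twitter-cli timeline --json --output feed.json

# Short form
twitter-cli search "python" --json -o results.json
```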
Redirecting Output
You can also use shell redirection with --json:
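For example:

```shell
# Redirect stdout to a dated file
twitter-cli timeline --json > feed_2024-01-15.json
```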
Importing from JSON
Using the --input Flag
Load previously exported JSON files to re-display or re-filter them:
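For example (the subcommand paired with --input depends on your installed version):

```shell
# Re-display a previously exported file instead of calling the API
twitter-cli timeline --input feed.json
```

Re-importing is useful for: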
- Offline browsing of previously fetched data
- Re-applying filters without making new API calls
- Sharing datasets with teammates
- Testing filter configurations
JSON Schema
Tweet Object
Tweets are serialized using the tweet_to_dict() function in serialization.py:
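The exact schema depends on your version; a representative tweet object might look like this (all field names are illustrative, following the camelCase convention described under Implementation Details):

```json
{
  "id": "1750000000000000000",
  "text": "Hello from twitter-cli 👋",
  "authorHandle": "alice",
  "createdAt": "2024-01-15T09:30:00Z",
  "likeCount": 12,
  "retweetCount": 3
}
```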
User Profile Object
User profiles are serialized using user_profile_to_dict():
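A representative user profile object (field names illustrative) might look like:

```json
{
  "handle": "alice",
  "displayName": "Alice",
  "bio": "Rustacean. Opinions my own.",
  "followerCount": 1024,
  "followingCount": 256
}
```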
Scripting Examples
Using jq for Processing
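Exported files are plain JSON arrays, so jq works directly on them. A sketch, using a hand-written two-tweet sample in place of a real export (the field names assume the illustrative schema above; in practice feed.json would come from twitter-cli ... --json -o feed.json):

```shell
# Create a small sample standing in for a real export
cat > feed.json <<'EOF'
[
  {"id": "1", "text": "hello", "authorHandle": "alice", "likeCount": 12},
  {"id": "2", "text": "jq is great", "authorHandle": "bob", "likeCount": 3}
]
EOF

# Extract just the tweet text
jq -r '.[].text' feed.json

# Keep only tweets with more than 10 likes
jq '[.[] | select(.likeCount > 10)]' feed.json

# Count tweets per author
jq 'group_by(.authorHandle) | map({author: .[0].authorHandle, n: length})' feed.json
```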
Python Processing
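Since exports are ordinary JSON, the standard library is enough. A sketch, with an inline sample standing in for a real export and field names assumed for illustration:

```python
import json

# In practice:  tweets = json.load(open("feed.json"))
tweets = [
    {"id": "1", "text": "hello", "authorHandle": "alice", "likeCount": 12},
    {"id": "2", "text": "data", "authorHandle": "bob", "likeCount": 3},
]

# Total likes across the export
total_likes = sum(t["likeCount"] for t in tweets)

# Group tweets by author
by_author = {}
for t in tweets:
    by_author.setdefault(t["authorHandle"], []).append(t)

print(total_likes)        # 15
print(sorted(by_author))  # ['alice', 'bob']
```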
Data Pipeline
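The flags documented above compose into a simple pipeline: export raw data, filter it externally, then re-import the result. A sketch (subcommands and the likeCount field are illustrative):

```shell
# 1. Save a dated raw export before filtering anything
twitter-cli timeline --json -o raw_2024-01-15.json

# 2. Filter offline with jq, without new API calls
jq '[.[] | select(.likeCount > 10)]' raw_2024-01-15.json > popular.json

# 3. Re-display the filtered subset in the CLI
twitter-cli timeline --input popular.json
```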
Implementation Details
The JSON serialization is handled by twitter_cli/serialization.py:
- tweets_to_json() - Serializes a list of Tweet objects to pretty-printed JSON
- tweets_from_json() - Deserializes a JSON string back to Tweet objects
- tweet_to_dict() - Converts a single Tweet to a dictionary
- tweet_from_dict() - Converts a dictionary to a Tweet object
- users_to_json() - Serializes user profiles to JSON
- user_profile_to_dict() - Converts a UserProfile to a dictionary
- ensure_ascii=False to preserve Unicode characters
- indent=2 for readable formatting
- Consistent camelCase field names for JavaScript interoperability
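A minimal sketch of how these helpers plausibly fit together, given those settings (the Tweet fields are assumptions, not the real model):

```python
import json
from dataclasses import dataclass


@dataclass
class Tweet:
    # Illustrative fields; the real Tweet model may differ
    id: str
    text: str
    author_handle: str


def tweet_to_dict(tweet: Tweet) -> dict:
    # camelCase keys for JavaScript interoperability
    return {"id": tweet.id, "text": tweet.text, "authorHandle": tweet.author_handle}


def tweet_from_dict(d: dict) -> Tweet:
    return Tweet(id=d["id"], text=d["text"], author_handle=d["authorHandle"])


def tweets_to_json(tweets: list[Tweet]) -> str:
    # ensure_ascii=False preserves Unicode; indent=2 keeps output readable
    return json.dumps([tweet_to_dict(t) for t in tweets], ensure_ascii=False, indent=2)


def tweets_from_json(s: str) -> list[Tweet]:
    return [tweet_from_dict(d) for d in json.loads(s)]
```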
Best Practices
- Use -o for large exports - More reliable than shell redirection for large datasets
- Save raw data first - Export to JSON before applying filters so you can re-filter later
- Version your exports - Include timestamps in filenames for tracking
- Validate JSON - Use jq or Python to validate exported files before processing
- Compress archives - Use gzip for long-term storage: gzip feed_2024-01-15.json
