Prerequisites
Before you begin, ensure you have:
- Node.js 18+ installed
- npm, yarn, or bun package manager
- A Supabase account (free tier works)
- (Optional) A Scrape.do account for live social media data
Installation
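The install commands for this section are not shown above. For a Vite project using any of the package managers listed in the prerequisites, a typical setup looks like this (check the repo's README for the exact scripts):

```shell
# Install dependencies (npm shown; yarn or bun equivalents work too)
npm install
```

After installation, `npm run dev` starts the Vite dev server; the dashboard URL later in this guide indicates it listens on port 8080.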
Set Up Environment Variables
Copy the example environment file and configure your credentials. Edit .env and add your Supabase credentials (required).
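A minimal .env might look like the following. The variable names match those referenced later in this guide; the values are placeholders to replace with your own project's credentials from the Supabase dashboard:

```shell
# Supabase credentials (required) — placeholder values, replace with your own
VITE_SUPABASE_URL=https://your-project-ref.supabase.co
VITE_SUPABASE_PUBLISHABLE_KEY=sb_publishable_xxxxxxxxxxxx
```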
Minimum Configuration: Only Supabase credentials are required to start. You can add Scrape.do, YouTube, and AI service keys later for full functionality.
Set Up Supabase Database
Run the database migration to create all required tables:
- Go to your Supabase project dashboard
- Navigate to SQL Editor
- Copy the contents of supabase/migrations/FULL_MIGRATION_RUN_IN_DASHBOARD.sql
- Paste and execute the SQL
Your First Analysis
Now let’s analyze a topic to see SENTi-radar in action.
Open the Dashboard
Navigate to http://localhost:8080 in your browser. You’ll see the main dashboard with a search bar.
Search for a Topic
Enter a topic, hashtag, or brand name in the search bar. For example:
- climate change
- #tech
- iPhone 15
Wait for Analysis
The system will:
- Fetch posts from available sources (YouTube by default, X/Reddit if Scrape.do is configured)
- Analyze emotions using keyword-based classification
- Generate an AI summary (if Gemini/Groq API keys are configured, otherwise uses local analysis)
- Calculate sentiment scores and crisis levels
Review Results
Once complete, you’ll see:
- Sentiment Gauge: Overall positive/negative/mixed sentiment (-100 to +100)
- Emotion Breakdown: Distribution across 6 emotions (joy, anger, sadness, fear, surprise, disgust)
- AI Insights: Strategic summary with key takeaways
- Live Feed: Recent posts with sentiment labels
- Crisis Alerts: Any detected spikes or volatility warnings
Add Live Data Sources
To unlock full functionality with real-time social media scraping:
Scrape.do (Recommended)
Get live X (Twitter) and Reddit data:
- Sign up at scrape.do (free tier available)
- Get your API token from the dashboard
- Add your API token to .env
- Restart the dev server
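The .env line might look like this. The exact variable name is not specified in this guide; `SCRAPEDO_TOKEN` below is an assumed name, so confirm it against the project's .env.example:

```shell
# Scrape.do API token — variable name is an assumption; check .env.example
SCRAPEDO_TOKEN=your-scrape-do-api-token
```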
Scrape.do provides residential proxies and JavaScript rendering for reliable scraping. See Scrape.do Integration for advanced configuration.
YouTube Data API
Enable YouTube comment analysis:
- Go to Google Cloud Console
- Create a project and enable YouTube Data API v3
- Create an API key
- Add the API key to .env
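The corresponding .env entry might look like this. The variable name is an assumption, not taken from this guide; check the project's .env.example for the exact name:

```shell
# YouTube Data API v3 key — variable name is an assumption; check .env.example
VITE_YOUTUBE_API_KEY=your-youtube-api-key
```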
AI Services (Optional)
Add Gemini or Groq for enhanced AI summaries:
- Gemini (Google)
- Groq (Secondary)
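The .env entries might look like the following. Both variable names are assumptions, so confirm them against the project's .env.example; you can configure either or both:

```shell
# AI service keys (optional) — variable names are assumptions; check .env.example
# Without them, the local keyword-based fallback summarizer is used.
GEMINI_API_KEY=your-gemini-api-key
GROQ_API_KEY=your-groq-api-key
```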
Fallback: If no AI keys are configured, SENTi-radar uses a local keyword-based summary generator as a fallback.
Deploy Supabase Edge Functions (Optional)
For production use, deploy the backend edge functions.
Troubleshooting
Build errors or missing dependencies
Clear cache and reinstall:
Supabase connection errors
Verify your VITE_SUPABASE_URL and VITE_SUPABASE_PUBLISHABLE_KEY are correct. Check that you’ve run the database migration.
No posts found
- If using Scrape.do, verify your token is valid and has remaining credits
- Try a different topic (some topics may have limited content)
- Enable YouTube API for broader coverage
AI summary not generating
- Verify Gemini/Groq API keys are correct
- Check API quotas haven’t been exceeded
- The local fallback should generate a basic summary even without AI keys
Next Steps
- Architecture Overview: Understand how the system works
- Environment Variables: Complete configuration reference
- Analyzing Topics: Learn advanced analysis techniques
- Deployment Guide: Deploy to production