Analytics Overview
The platform offers four main analytics sections:
- Central Dashboard
- Analíticas (Comprehensive Analytics)
- Weekly Report
- Question Analytics
Central Dashboard
Understanding Key Metrics
Access the Dashboard
Review Executive KPIs
- Total Users: Trend indicator shows growth/decline; compare to the previous period
- Active Users: Typically defined as activity in the last 7-30 days; a high active-user % indicates good engagement
- Participation Rate: Formula: (Users Started / Total Users) × 100; target >70% for successful campaigns
- Average Score: Shows overall learning effectiveness; target >70% for most dilemmas
- Risk Index: Based on failure rates on critical questions; higher = more organizational risk detected
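The Participation Rate formula above can be expressed as a short function (a minimal sketch for working with exported numbers; the platform computes this metric itself):

```python
def participation_rate(users_started: int, total_users: int) -> float:
    """Participation Rate = (Users Started / Total Users) x 100."""
    if total_users == 0:
        return 0.0  # avoid division by zero for empty campaigns
    return users_started / total_users * 100

# 90 of 120 invited users started -> 75.0%, above the >70% target
print(participation_rate(90, 120))
```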
Dashboard Filters
Customize the dashboard view:
- Client Filter: Admin users can filter to specific clients
- Date Range: Focus on specific campaign periods
- Dilemma Filter: Isolate specific games
Comprehensive Analytics (Analíticas)
The Analíticas section provides the deepest insights across three main areas.
1. Participation & Engagement
Activity Over Time (Line Chart)
- Identify peak activity days (shows when users are most engaged)
- Spot declining trends (may need re-engagement campaign)
- Correlate spikes with internal communications (prove effectiveness)
- Campaigns typically peak 2-3 days after launch
- Reminder emails should trigger visible spikes
- Flat/declining lines indicate need for intervention
Hourly Usage Patterns (Bar Chart)
- Identify when users prefer to engage
- Schedule reminder emails before peak hours
- Avoid maintenance during high-traffic periods
- 9-11am: Morning productivity peak
- 1-3pm: Post-lunch engagement
- After 5pm: Minimal activity (users prefer work hours)
Weekday Distribution (Bar Chart)
- Understand weekly engagement patterns
- Launch campaigns early in week for maximum reach
- Avoid sending invitations on Fridays
- Tuesday-Thursday: Highest engagement
- Monday: Lower (catch-up from weekend)
- Friday: Lowest (weekend mentality)
- Weekend: Near-zero for B2B contexts
Top Users with Highest Engagement
- Recognize and reward top performers
- Identify potential champions for future campaigns
- Understand characteristics of highly engaged users (role, department, etc.)
- Send thank-you emails to top users
- Request testimonials
- Ask if they’d promote to colleagues
- Consider them for early access to new dilemmas
2. Learning & Performance
Distribution by Category (Donut Chart)
- Ensure balanced coverage of topics
- Identify over/under-represented areas
- Plan content creation for gaps
- Balanced distribution = comprehensive training
- Heavy weighting in one category = focused campaign
- Use to demonstrate coverage to clients
Performance by Dilemma (Horizontal Bar Chart)
- Compare dilemma effectiveness
- Identify which scenarios drive best learning
- Flag dilemmas that may be too easy/hard
- <50%: Too difficult or poorly designed
- 50-75%: Good challenge level
- 75-90%: Appropriate difficulty
- >90%: May be too easy (not driving learning)
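The success-rate bands above can be encoded as a small triage helper when reviewing exported dilemma data (a sketch; the bands are this guide's heuristics, not a platform setting):

```python
def difficulty_band(success_rate: float) -> str:
    """Map a dilemma's success rate (0-100) to the guide's bands."""
    if success_rate < 50:
        return "too difficult or poorly designed"
    if success_rate <= 75:
        return "good challenge level"
    if success_rate <= 90:
        return "appropriate difficulty"
    return "may be too easy"

print(difficulty_band(68))  # good challenge level
```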
Score Distribution (Pie Chart)
- Understand overall performance spread
- Identify if most users are succeeding or struggling
- Segment follow-up interventions
- Small % in 0-40% (few failing)
- Majority in 60-80% (most learning)
- Some in 80-100% (high performers)
- Large % in 0-40%: Content too hard or not relevant
- Everyone in 80-100%: Content too easy
Competency Map / Radar Chart
- Identify organizational strengths and weaknesses
- Target training to specific competency gaps
- Track improvement over time by comparing period-over-period
- Unbalanced radar = specific training needs
- Small radar area = overall low competency (need fundamental training)
- Large radar area = strong ethical foundation
Performance by Area / Department
- Compare departmental performance
- Identify which teams need additional support
- Share successes and best practices across areas
If one area lags another (e.g., IT scoring below Sales), ask:
- Is IT content less relevant to their roles?
- Does IT need different dilemmas?
- Can Sales leaders share engagement tactics with IT?
Time by Category (Bar Chart)
- Understand which topics are most complex (longer time)
- Identify areas where users rush (may need emphasis)
- Optimize dilemma length by removing time-consuming but low-value questions
- Longer time doesn’t always = better learning
- Very short time may indicate guessing
- Optimal: 30-90 seconds per question
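The 30-90 second guideline can be turned into a per-response flag when scanning exported answer times (a sketch; the thresholds are this guide's heuristic, not a platform setting):

```python
def time_flag(seconds: float) -> str:
    """Flag a per-question answer time against the 30-90s guideline."""
    if seconds < 30:
        return "possible guessing"
    if seconds > 90:
        return "over guideline"
    return "ok"

print(time_flag(12))  # possible guessing
```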
3. Risk Analysis
Top Risk Questions Table
- Question: The scenario/question text
- Category: Topic area (Ethics, Compliance, etc.)
- Failures: Number of users who answered incorrectly
- Error Rate: Percentage who got it wrong
- Risk Level: Impact assessment (High/Medium/Low)
- Identify Training Gaps: High error rates = users don’t understand this topic
- Flag Compliance Risks: Critical questions with high failure = real organizational risk
- Improve Content: If error rate >50%, question may be unclear (rewrite or replace)
- Plan Interventions: Create targeted training for high-risk categories
- Generate list of users who failed critical questions
- Require re-training or manager conversation
- Consider policy reminders or process changes
- Track improvement in subsequent campaigns
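The first follow-up step, generating a list of users who failed critical questions, can be sketched against a CSV export. The column names (user, is_critical, correct) are illustrative; match them to the headers in your actual export:

```python
import csv
import io

def users_failing_critical(csv_text: str) -> list:
    """Collect users who answered any critical question incorrectly."""
    failing = set()
    for row in csv.DictReader(io.StringIO(csv_text)):
        # hypothetical column names -- adjust to the real export
        if row["is_critical"] == "yes" and row["correct"] == "no":
            failing.add(row["user"])
    return sorted(failing)

sample = """user,is_critical,correct
ana,yes,no
ben,no,no
ana,yes,yes
carla,yes,no
"""
print(users_failing_critical(sample))  # ['ana', 'carla']
```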
Using Filters in Analíticas
Maximize insight by combining filters:
Set Date Range
- Start Date: Beginning of campaign or analysis period
- End Date: Current date or campaign end
Filter by Client
- Select specific client to isolate their data
- Leave blank to see aggregate across all clients
Filter by Dilemma
- Evaluate individual dilemma performance
- Compare user behavior across different formats (Angel/Demon vs Millonario)
Weekly Report
The Weekly Report section provides an executive-friendly summary perfect for stakeholder communication.
Understanding Weekly KPIs
Review Summary Metrics
- Compare to previous weeks to track momentum
- More important than total sessions (user reach)
- Track learning effectiveness week-over-week
- Completion Rate % shows engagement quality
- Prime targets for re-engagement emails
Analyze Daily Activity Chart
- Spikes: Correspond with reminders, deadlines, or promotions
- Valleys: Weekends, holidays, or lack of communication
- Trend: Is activity increasing, stable, or declining?
Review Top Lists
- Shows which content resonates
- Consider promoting similar content
- Recognize engaged organizations
- Learn best practices to share with others
- Track onboarding effectiveness
- Correlate with client expansion efforts
Sending Weekly Reports via Email
Configure Report Period
- Default: Last 7 days
- Custom: Any start/end date range
Preview Report
- See exactly what recipients will receive
- Includes all charts and data tables
- Formatted for easy reading in email clients
Send to Stakeholders
- Click “Enviar por Email” button
- Enter recipient emails (comma-separated):
  [email protected], [email protected], [email protected]
- Review the summary message
- Click “Enviar”
Customizing Weekly Reports
Best Practices:
- Add a personal message summarizing key insights
- Highlight 2-3 main takeaways
- Include specific recommendations or next steps
- Celebrate wins (high scores, increased participation)
- Address concerns proactively (low engagement, risk areas)
Question Analytics (Analítica Transversal)
For granular analysis of individual questions:
Apply Filters
- Client: Specific organization
- Dilemma: Specific game
- Difficulty: High/Medium/Low based on performance
- Search: Find questions by text
- Order By: Sort by error rate, attempts, difficulty, etc.
Analyze Question Table
| Column | Meaning | Action Threshold |
|---|---|---|
| Question | Text of the scenario | Look for critical indicator (red dot) |
| Client | Organization | Filter for client-specific analysis |
| Dilemma | Which game | Compare across games |
| Attempts | How many times answered | Low attempts = not enough data |
| Success Rate | % answered correctly | <60% = potential issue |
| Error Rate | % answered incorrectly | >40% = review needed |
| Difficulty | Calculated difficulty level | Track if aligns with intent |
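The action thresholds in the table can be combined into one triage rule (a sketch; the 20-attempt minimum is an assumed cutoff, since the table only says "low attempts = not enough data"):

```python
def needs_review(attempts: int, success_rate: float,
                 min_attempts: int = 20) -> str:
    """Triage a question row using the table's action thresholds.

    min_attempts is an assumed cutoff for "low attempts";
    tune it to your campaign size.
    """
    if attempts < min_attempts:
        return "insufficient data"
    if success_rate < 60:  # table: <60% success = potential issue
        return "review needed"
    return "ok"
```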
Identify Problematic Questions
When a question shows a high error rate, possible causes include:
- Question may be ambiguous
- Correct answer may be wrong
- Topic may need better explanation
- Users genuinely don’t understand the concept
Question Performance Deep Dive
Drilling Down to Individual Responses
- Click “Detalle” next to the question
- View all user responses:
- Who answered correctly vs incorrectly
- Which wrong answers were chosen (reveals misconceptions)
- Time spent on question (too fast = guessing)
- Identify patterns:
- All users in specific department failing?
- Certain demographics struggling?
- One wrong answer chosen by everyone? (may be written ambiguously)
- Take action:
- Rewrite question for clarity
- Provide additional training on topic
- Follow up with users who failed
Exporting Data for Advanced Analysis
All major sections offer CSV export:
Configure Filters First
- Client(s)
- Date range
- Dilemma(s)
- Specific segments
Click Export Button
- Clientes: Export client metrics
- Usuarios: Export user progress data
- Preguntas: Export question performance
- Analíticas: May export chart data
Open in Spreadsheet Tool
- Microsoft Excel
- Google Sheets
- Data analysis tools (Python, R, Tableau, etc.)
Perform Advanced Analysis
- Pivot Tables: Cross-tab any dimensions (Area × Score, Dilemma × Completion Rate)
- Trend Analysis: Plot time-series of engagement or scores
- Cohort Analysis: Compare user groups (early adopters vs late, high performers vs low)
- Regression: Identify factors predicting success (department, time spent, etc.)
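A pivot like Area × Score can be reproduced from an export with the standard library alone (a sketch; pandas, Excel, or a BI tool does the same with less code):

```python
from collections import defaultdict

def average_by(rows, key_field, value_field):
    """Tiny pivot: mean of value_field grouped by key_field."""
    totals = defaultdict(lambda: [0.0, 0])  # key -> [sum, count]
    for row in rows:
        acc = totals[row[key_field]]
        acc[0] += float(row[value_field])
        acc[1] += 1
    return {key: s / n for key, (s, n) in totals.items()}

# illustrative rows shaped like a Usuarios export
rows = [
    {"area": "Sales", "score": "82"},
    {"area": "IT", "score": "61"},
    {"area": "Sales", "score": "74"},
]
print(average_by(rows, "area", "score"))  # {'Sales': 78.0, 'IT': 61.0}
```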
Creating Client-Facing Reports
Monthly Business Review Template
Executive Summary Slide Deck Structure
- Campaign name and period
- Total users invited vs completed
- Overall completion rate and average score
- Screenshot of dashboard with KPIs
- Participation rate trend (week-by-week)
- Activity over time chart
- Top engaged departments/areas
- Recognition of top performers
- Average scores by dilemma
- Performance by department (bar chart)
- Competency radar showing strengths/gaps
- Score distribution
- Critical questions with high failure rates
- Categories needing attention
- Specific recommendations for follow-up training
- Action plan for next period
- Testimonials from participants
- Behavior change anecdotes
- Certificates issued
- Next steps and future campaigns
Data Storytelling Best Practices
Benchmarking and Targets
Industry Benchmarks
Compare your metrics to typical performance:

| Metric | Poor | Average | Excellent |
|---|---|---|---|
| Participation Rate | <50% | 50-70% | >70% |
| Completion Rate | <40% | 40-60% | >60% |
| Average Score | <60% | 60-75% | >75% |
| Time to Complete | >30 min | 15-30 min | <15 min |
| Re-engagement Success | <10% | 10-20% | >20% |
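The benchmark table translates directly into a lookup (a sketch; Time to Complete is omitted because its scale is inverted, lower is better):

```python
# (poor_below, excellent_above) thresholds from the benchmark table
BANDS = {
    "participation_rate": (50, 70),
    "completion_rate": (40, 60),
    "average_score": (60, 75),
    "re_engagement_success": (10, 20),
}

def benchmark(metric: str, value: float) -> str:
    """Classify a metric value as Poor / Average / Excellent."""
    poor_below, excellent_above = BANDS[metric]
    if value < poor_below:
        return "Poor"
    if value > excellent_above:
        return "Excellent"
    return "Average"

print(benchmark("participation_rate", 72))  # Excellent
```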
Setting SMART Goals
For each campaign, establish specific targets. Example:
- Specific: Achieve 75% completion rate
- Measurable: Track in Usuarios section weekly
- Achievable: Based on 65% in last campaign
- Relevant: Aligns with client’s L&D objectives
- Time-bound: Within 4-week campaign window
Troubleshooting Analytics Issues
Data Not Updating in Real-Time
- Browser cache
- Database caching
- Server processing delay
- Hard refresh browser (Ctrl+Shift+R / Cmd+Shift+R)
- Wait 5-10 minutes for batch processing
- Clear browser cache and cookies
- Try in incognito/private window
Charts Not Displaying
- JavaScript errors
- Chart.js library not loading
- Ad blocker interference
- Check browser console for errors (F12 → Console)
- Disable ad blockers/privacy extensions
- Try different browser
- Ensure JavaScript is enabled
Export CSV Contains Unexpected Data
- Filters not applied before export
- Encoding issues with special characters
- Re-apply filters and export again
- Open CSV with UTF-8 encoding
- Use Excel’s “Get Data from Text/CSV” with proper encoding
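The encoding fix can also be applied when post-processing exports in code: decoding with utf-8-sig handles both plain UTF-8 files and files that carry a byte-order mark, so accented characters (as in "Analíticas") survive intact. A minimal sketch:

```python
import csv
import io

def decode_export(raw: bytes):
    """Parse CSV bytes, tolerating an optional UTF-8 BOM."""
    text = raw.decode("utf-8-sig")  # strips the BOM if present
    return list(csv.DictReader(io.StringIO(text)))

# illustrative export bytes with a BOM and an accented value
raw = "\ufeffnombre,score\nAnalíticas demo,80\n".encode("utf-8")
rows = decode_export(raw)
print(rows[0]["nombre"])  # Analíticas demo
```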