Dashboard Dilemas provides comprehensive analytics to measure engagement, learning outcomes, and organizational risk. This guide shows you how to leverage these insights effectively.

Analytics Overview

The platform offers four main analytics sections:

Central Dashboard

High-level KPIs and real-time metrics for quick status checks

Analíticas

Deep-dive analytics with charts, trends, and performance breakdowns

Weekly Report

Executive summary of weekly activity, perfect for stakeholder updates

Question Analytics

Granular analysis of individual questions to identify learning gaps

Central Dashboard

Understanding Key Metrics

1

Access the Dashboard

From the sidebar, click on the Dashboard or home icon to view the central dashboard.
2

Review Executive KPIs

The dashboard displays 4-6 key performance indicators:
Total Users: Number of registered participants
  • Trend indicator shows growth/decline
  • Compare to previous period
Active Users: Users who logged in and engaged recently
  • Typically defined as activity in last 7-30 days
  • High active user % indicates good engagement
Participation Rate: Percentage of invited users who started
  • Formula: (Users Started / Total Users) × 100
  • Target: >70% for successful campaigns
Average Performance: Mean score across all sessions
  • Shows overall learning effectiveness
  • Target: >70% for most dilemmas
Risk Index: Percentage indicating areas of concern
  • Based on failure rates on critical questions
  • Higher = more organizational risk detected
KPIs update in real-time as users complete sessions. Check daily during active campaigns.
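As a concrete reference, the KPI formulas above can be sketched in Python. The function names and sample numbers are illustrative, not part of the platform:

```python
def participation_rate(users_started, total_users):
    """Participation Rate = (Users Started / Total Users) x 100."""
    if total_users == 0:
        return 0.0
    return users_started / total_users * 100

def risk_index(critical_failures, critical_attempts):
    """Share of critical-question answers that were incorrect (higher = more risk)."""
    if critical_attempts == 0:
        return 0.0
    return critical_failures / critical_attempts * 100

# Example: 82 of 110 invited users started; 12 of 95 critical answers failed.
rate = participation_rate(82, 110)   # ~74.5%, above the 70% target
risk = risk_index(12, 95)            # ~12.6%
print(f"Participation: {rate:.1f}%  Risk index: {risk:.1f}%")
```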
3

Spot Trends at a Glance

Use the dashboard to quickly identify:
  • Sudden drops in participation (investigate cause)
  • Score improvements over time (learning is happening)
  • High risk index (focus on specific questions/categories)

Dashboard Filters

Customize the dashboard view:
  • Client Filter: Admin users can filter to specific clients
  • Date Range: Focus on specific campaign periods
  • Dilemma Filter: Isolate specific games
For client reporting, filter to their organization and take a screenshot of the dashboard. It’s a quick, visual way to show progress.

Comprehensive Analytics (Analíticas)

The Analíticas section provides the deepest insights across three main areas.

1. Participation & Engagement

What It Shows: Daily session volume over the selected date range
How to Use:
  • Identify peak activity days (shows when users are most engaged)
  • Spot declining trends (may need re-engagement campaign)
  • Correlate spikes with internal communications (prove effectiveness)
Insights:
  • Campaigns typically peak 2-3 days after launch
  • Reminder emails should trigger visible spikes
  • Flat/declining lines indicate need for intervention
If you see a spike on a specific day, check what happened that day (email sent, leadership endorsement, deadline reminder, etc.) and repeat the tactic.
What It Shows: Number of sessions by hour of day (0-23)
How to Use:
  • Identify when users prefer to engage
  • Schedule reminder emails before peak hours
  • Avoid maintenance during high-traffic periods
Typical Patterns:
  • 9-11am: Morning productivity peak
  • 1-3pm: Post-lunch engagement
  • After 5pm: Minimal activity (users prefer work hours)
Business dilemmas are rarely completed outside work hours, unlike consumer gaming. Time communications accordingly.
What It Shows: Sessions by day of week (Monday-Sunday)
How to Use:
  • Understand weekly engagement patterns
  • Launch campaigns early in week for maximum reach
  • Avoid sending invitations on Fridays
Typical Patterns:
  • Tuesday-Thursday: Highest engagement
  • Monday: Lower (catch-up from weekend)
  • Friday: Lowest (weekend mentality)
  • Weekend: Near-zero for B2B contexts
What It Shows: Top 5 users by number of sessions completed
How to Use:
  • Recognize and reward top performers
  • Identify potential champions for future campaigns
  • Understand characteristics of highly engaged users (role, department, etc.)
Action Items:
  • Send thank-you emails to top users
  • Request testimonials
  • Ask if they’d promote to colleagues
  • Consider them for early access to new dilemmas

2. Learning & Performance

What It Shows: Percentage of questions by category (Ethics, Compliance, etc.)
How to Use:
  • Ensure balanced coverage of topics
  • Identify over/under-represented areas
  • Plan content creation for gaps
Insights:
  • Balanced distribution = comprehensive training
  • Heavy weighting in one category = focused campaign
  • Use to demonstrate coverage to clients
What It Shows: Average score for each dilemma
How to Use:
  • Compare dilemma effectiveness
  • Identify which scenarios drive best learning
  • Flag dilemmas that may be too easy/hard
Benchmarks:
  • <50%: Too difficult or poorly designed
  • 50-75%: Good challenge level
  • 75-90%: Appropriate difficulty
  • >90%: May be too easy (not driving learning)
A dilemma with 100% average score isn’t necessarily good—it may be too easy to provide real learning value.
What It Shows: How users’ scores are distributed across ranges (0-20%, 20-40%, etc.)
How to Use:
  • Understand overall performance spread
  • Identify if most users are succeeding or struggling
  • Segment follow-up interventions
Ideal Distribution:
  • Small % in 0-40% (few failing)
  • Majority in 60-80% (most learning)
  • Some in 80-100% (high performers)
Red Flags:
  • Large % in 0-40%: Content too hard or not relevant
  • Everyone in 80-100%: Content too easy
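The distribution buckets and red-flag checks above can be sketched as follows. The bucket edges mirror the chart ranges; the 30% and 90% flag thresholds and the sample scores are illustrative assumptions:

```python
def score_distribution(scores):
    """Bucket scores (0-100) into the 20-point ranges used in the chart."""
    buckets = {"0-20": 0, "20-40": 0, "40-60": 0, "60-80": 0, "80-100": 0}
    for s in scores:
        if s < 20: buckets["0-20"] += 1
        elif s < 40: buckets["20-40"] += 1
        elif s < 60: buckets["40-60"] += 1
        elif s < 80: buckets["60-80"] += 1
        else: buckets["80-100"] += 1
    return buckets

def red_flags(buckets, total):
    """Flag the two problem patterns described above (thresholds are assumptions)."""
    flags = []
    if (buckets["0-20"] + buckets["20-40"]) / total > 0.3:
        flags.append("content too hard or not relevant")
    if buckets["80-100"] / total > 0.9:
        flags.append("content too easy")
    return flags

dist = score_distribution([15, 35, 55, 65, 70, 72, 78, 85, 90, 68])
print(dist, red_flags(dist, 10))
```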
What It Shows: Performance across different skill areas (Ethics, Decision-Making, Compliance, etc.)
How to Use:
  • Identify organizational strengths and weaknesses
  • Target training to specific competency gaps
  • Track improvement over time by comparing period-over-period
Insights:
  • Unbalanced radar = specific training needs
  • Small radar area = overall low competency (need fundamental training)
  • Large radar area = strong ethical foundation
What It Shows: Average score segmented by organizational area
How to Use:
  • Compare departmental performance
  • Identify which teams need additional support
  • Share successes and best practices across areas
Example Use Case: If Sales scores 85% but IT scores 62%, investigate:
  • Is IT content less relevant to their roles?
  • Does IT need different dilemmas?
  • Can Sales leaders share engagement tactics with IT?
What It Shows: Average time users spend on each category of questions
How to Use:
  • Understand which topics are most complex (longer time)
  • Identify areas where users rush (may need emphasis)
  • Optimize dilemma length by removing time-consuming but low-value questions
Insights:
  • Longer time doesn’t always = better learning
  • Very short time may indicate guessing
  • Optimal: 30-90 seconds per question
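A simple classifier for the timing heuristics above. The 30-90 second band comes from the text; the 10-second guessing cutoff and sample question names are assumptions for illustration:

```python
def classify_response_time(seconds):
    """Heuristic bands from the guidance above: 30-90s is the optimal range."""
    if seconds < 10:
        return "possible guessing"   # assumed cutoff for rushed answers
    if seconds < 30:
        return "fast"
    if seconds <= 90:
        return "optimal"
    return "slow (complex or confusing)"

times = {"Ethics Q1": 6, "Ethics Q2": 45, "Compliance Q3": 140}
for question, t in times.items():
    print(question, "->", classify_response_time(t))
```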

3. Risk Analysis

What It Shows: Questions with highest failure rates, especially critical ones
Columns:
  • Question: The scenario/question text
  • Category: Topic area (Ethics, Compliance, etc.)
  • Failures: Number of users who answered incorrectly
  • Error Rate: Percentage who got it wrong
  • Risk Level: Impact assessment (High/Medium/Low)
How to Use:
  1. Identify Training Gaps: High error rates = users don’t understand this topic
  2. Flag Compliance Risks: Critical questions with high failure = real organizational risk
  3. Improve Content: If error rate >50%, question may be unclear (rewrite or replace)
  4. Plan Interventions: Create targeted training for high-risk categories
Critical questions with >30% error rate represent significant organizational risk. Escalate to client leadership immediately.
Action Steps:
  • Generate list of users who failed critical questions
  • Require re-training or manager conversation
  • Consider policy reminders or process changes
  • Track improvement in subsequent campaigns
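The escalation rule above (critical questions over 30% error rate) can be sketched as a filter over the risk table. The rows here are hypothetical examples mirroring the columns described:

```python
# Hypothetical rows mirroring the risk table columns described above.
questions = [
    {"question": "Reporting a gift from a vendor", "category": "Compliance",
     "critical": True, "attempts": 120, "failures": 44},
    {"question": "Handling a customer complaint", "category": "Ethics",
     "critical": False, "attempts": 110, "failures": 30},
    {"question": "Sharing customer data externally", "category": "Data Privacy",
     "critical": True, "attempts": 100, "failures": 21},
]

def escalation_list(rows, threshold=30.0):
    """Critical questions whose error rate exceeds the escalation threshold."""
    flagged = []
    for row in rows:
        error_rate = row["failures"] / row["attempts"] * 100
        if row["critical"] and error_rate > threshold:
            flagged.append((row["question"], round(error_rate, 1)))
    return flagged

print(escalation_list(questions))
```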

Using Filters in Analíticas

Maximize insight by combining filters:
1

Set Date Range

  • Start Date: Beginning of campaign or analysis period
  • End Date: Current date or campaign end
Compare same date ranges across different periods (e.g., Jan vs Feb) to track improvement.
2

Filter by Client

For multi-client admins:
  • Select specific client to isolate their data
  • Leave blank to see aggregate across all clients
3

Filter by Dilemma

Focus on specific game:
  • Evaluate individual dilemma performance
  • Compare user behavior across different formats (Angel/Demon vs Millonario)
4

Apply and Export

After setting filters:
  1. Click “Filtrar” to update all charts
  2. Wait for data to load
  3. Use browser print function or screenshot for reports
  4. Click export button if available for raw data

Weekly Report

The Weekly Report section provides an executive-friendly summary perfect for stakeholder communication.

Understanding Weekly KPIs

1

Access Weekly Report

Navigate to Weekly Report from the sidebar or dashboard tabs.
2

Review Summary Metrics

The top KPI cards show:
Sessions: Total sessions started this week
  • Compare to previous weeks to track momentum
Unique Users: Number of individuals who participated
  • More important than total sessions (user reach)
Average Score: Mean performance across all sessions
  • Track learning effectiveness week-over-week
Completed Sessions: Sessions finished vs started
  • Completion Rate % shows engagement quality
Pending Users: Users with incomplete dilemmas >3 days old
  • Prime targets for re-engagement emails
3

Analyze Daily Activity Chart

The bar chart shows sessions per day over the report period:
  • Spikes: Correspond with reminders, deadlines, or promotions
  • Valleys: Weekends, holidays, or lack of communication
  • Trend: Is activity increasing, stable, or declining?
A healthy campaign shows steady or increasing activity. Declining activity requires intervention.
4

Review Top Lists

Three ranking lists provide quick insights:
Top Dilemmas: Most-played games this week
  • Shows which content resonates
  • Consider promoting similar content
Active Clients: Clients with most user activity
  • Recognize engaged organizations
  • Learn best practices to share with others
New Users: Count of users registered this week
  • Track onboarding effectiveness
  • Correlate with client expansion efforts

Sending Weekly Reports via Email

1

Configure Report Period

Use date range filters to set the reporting period:
  • Default: Last 7 days
  • Custom: Any start/end date range
2

Preview Report

Click “Ver HTML” to preview the email version:
  • See exactly what recipients will receive
  • Includes all charts and data tables
  • Formatted for easy reading in email clients
3

Send to Stakeholders

  1. Click “Enviar por Email” button
  2. Enter recipient emails (comma-separated): [email protected], [email protected], [email protected]
  3. Review the summary message
  4. Click “Enviar”
The system sends a professionally formatted HTML email with all KPIs, charts, and insights. Recipients can forward to their teams.
4

Verify Delivery

  • Success message confirms email sent
  • Recipients should receive within minutes
  • If not received, check spam filters and email addresses

Customizing Weekly Reports

Create a regular cadence (e.g., every Monday morning) to send weekly reports. Consistency builds stakeholder confidence and demonstrates ongoing value.
Best Practices:
  • Add a personal message summarizing key insights
  • Highlight 2-3 main takeaways
  • Include specific recommendations or next steps
  • Celebrate wins (high scores, increased participation)
  • Address concerns proactively (low engagement, risk areas)

Question Analytics (Analítica Transversal)

For granular analysis of individual questions:
1

Access Question Analytics

Navigate to Preguntas (Questions) from the sidebar.
2

Apply Filters

Filter to focus your analysis:
  • Client: Specific organization
  • Dilemma: Specific game
  • Difficulty: High/Medium/Low based on performance
  • Search: Find questions by text
  • Order By: Sort by error rate, attempts, difficulty, etc.
3

Analyze Question Table

The table shows for each question:
Column | Meaning | Action Threshold
Question | Text of the scenario | Look for critical indicator (red dot)
Client | Organization | Filter for client-specific analysis
Dilemma | Which game | Compare across games
Attempts | How many times answered | Low attempts = not enough data
Success Rate | % answered correctly | <60% = potential issue
Error Rate | % answered incorrectly | >40% = review needed
Difficulty | Calculated difficulty level | Track if aligns with intent
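The action thresholds in the table can be expressed as a small helper. The function and its 20-attempt minimum for "enough data" are illustrative assumptions, not platform behavior:

```python
def review_flags(attempts, correct, min_attempts=20):
    """Apply the question-table action thresholds: <60% success or >40% error."""
    if attempts < min_attempts:
        return ["not enough data"]   # assumed minimum sample size
    success_rate = correct / attempts * 100
    error_rate = 100 - success_rate
    flags = []
    if success_rate < 60:
        flags.append("potential issue")
    if error_rate > 40:
        flags.append("review needed")
    return flags

print(review_flags(10, 5))    # too few attempts to judge
print(review_flags(50, 27))   # 54% success rate trips both thresholds
```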
4

Identify Problematic Questions

Sort by Error Rate (descending) to find:
High Error Rate (>50%):
  • Question may be ambiguous
  • Correct answer may be wrong
  • Topic may need better explanation
  • Users genuinely don’t understand concept
Action: Click “Detalle” to review specific user responses and determine root cause
5

Track Critical Questions

Filter to show only critical questions (or look for red dot indicator):
  • These address compliance, legal, or safety topics
  • High failure rates = organizational risk
  • Require immediate attention and follow-up
If a critical question on harassment, data privacy, or safety has >30% error rate, escalate immediately to client leadership. This may indicate serious organizational risk.

Question Performance Deep Dive

For any question of concern:
  1. Click “Detalle” next to the question
  2. View all user responses:
    • Who answered correctly vs incorrectly
    • Which wrong answers were chosen (reveals misconceptions)
    • Time spent on question (too fast = guessing)
  3. Identify patterns:
    • All users in specific department failing?
    • Certain demographics struggling?
    • One wrong answer chosen by everyone? (may be written ambiguously)
  4. Take action:
    • Rewrite question for clarity
    • Provide additional training on topic
    • Follow up with users who failed

Exporting Data for Advanced Analysis

All major sections offer CSV export:
1

Configure Filters First

Before exporting, set filters to get exactly the data you need:
  • Client(s)
  • Date range
  • Dilemma(s)
  • Specific segments
2

Click Export Button

Look for “Exportar CSV” or “Exportar a Excel” buttons:
  • Clientes: Export client metrics
  • Usuarios: Export user progress data
  • Preguntas: Export question performance
  • Analíticas: May export chart data
3

Open in Spreadsheet Tool

The CSV file can be opened in:
  • Microsoft Excel
  • Google Sheets
  • Data analysis tools (Python, R, Tableau, etc.)
4

Perform Advanced Analysis

Common analyses:
  • Pivot Tables: Cross-tab any dimensions (Area × Score, Dilemma × Completion Rate)
  • Trend Analysis: Plot time-series of engagement or scores
  • Cohort Analysis: Compare user groups (early adopters vs late, high performers vs low)
  • Regression: Identify factors predicting success (department, time spent, etc.)
For clients who want to integrate with their own BI tools (Power BI, Tableau), export CSVs on a regular schedule and provide as data feeds.
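As a minimal sketch of the pivot-table idea (average score per organizational area) using only the standard library, with invented sample rows standing in for the exported CSV data:

```python
from collections import defaultdict

# Illustrative export rows: (area, score) pairs as they might appear in a CSV.
rows = [("Sales", 85), ("Sales", 80), ("IT", 60), ("IT", 64), ("HR", 75)]

def mean_score_by_area(rows):
    """Simple pivot: average score per organizational area."""
    totals = defaultdict(lambda: [0, 0])  # area -> [score sum, count]
    for area, score in rows:
        totals[area][0] += score
        totals[area][1] += 1
    return {area: s / n for area, (s, n) in totals.items()}

print(mean_score_by_area(rows))
```

In practice the same cross-tab takes one line in a spreadsheet pivot table or pandas; the point is only that the exported rows contain everything needed for Area × Score analysis.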

Creating Client-Facing Reports

Monthly Business Review Template

Slide 1: Overview
  • Campaign name and period
  • Total users invited vs completed
  • Overall completion rate and average score
  • Screenshot of dashboard with KPIs
Slide 2: Engagement Metrics
  • Participation rate trend (week-by-week)
  • Activity over time chart
  • Top engaged departments/areas
  • Recognition of top performers
Slide 3: Learning Outcomes
  • Average scores by dilemma
  • Performance by department (bar chart)
  • Competency radar showing strengths/gaps
  • Score distribution
Slide 4: Risk & Opportunities
  • Critical questions with high failure rates
  • Categories needing attention
  • Specific recommendations for follow-up training
  • Action plan for next period
Slide 5: Success Stories
  • Testimonials from participants
  • Behavior change anecdotes
  • Certificates issued
  • Next steps and future campaigns

Data Storytelling Best Practices

Focus on Insights, Not Just Data: Don’t just show numbers—explain what they mean and why they matter.
❌ Bad: “Completion rate is 67%”
✅ Good: “67% completion rate exceeds industry benchmark of 55%, demonstrating strong employee engagement with ethics training. The 33% who haven’t completed represent an opportunity to reinforce these critical concepts.”
Use Visuals: Readers absorb charts far faster than tables of numbers. Use:
  • Line charts for trends over time
  • Bar charts for comparisons
  • Pie/donut charts for part-to-whole relationships
  • Tables only for precise values or lookup
Tell a Story: Structure reports with a narrative:
  1. Setup: What we did (launched dilemma campaign)
  2. Conflict: Challenges faced (low initial engagement)
  3. Resolution: Actions taken (reminders, manager involvement)
  4. Result: Outcomes achieved (80% completion, high scores)
  5. Next Chapter: Future plans (new dilemmas, expansion)

Benchmarking and Targets

Industry Benchmarks

Compare your metrics to typical performance:
Metric | Poor | Average | Excellent
Participation Rate | <50% | 50-70% | >70%
Completion Rate | <40% | 40-60% | >60%
Average Score | <60% | 60-75% | >75%
Time to Complete | >30 min | 15-30 min | <15 min
Re-engagement Success | <10% | 10-20% | >20%
Benchmarks vary by industry, organization size, and content difficulty. Track your own historical performance as the best comparison.
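A quick classifier using the thresholds from the benchmark table. It covers only the higher-is-better metrics (Time to Complete is inverted and would need its own handling); the dictionary keys are illustrative names, not platform fields:

```python
# Thresholds from the benchmark table above (higher-is-better metrics only).
BENCHMARKS = {
    "participation_rate": (50, 70),  # (poor below, excellent above)
    "completion_rate": (40, 60),
    "average_score": (60, 75),
}

def benchmark(metric, value):
    """Classify a metric value as Poor / Average / Excellent."""
    low, high = BENCHMARKS[metric]
    if value < low:
        return "Poor"
    if value > high:
        return "Excellent"
    return "Average"

print(benchmark("participation_rate", 74))
print(benchmark("average_score", 58))
```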

Setting SMART Goals

For each campaign, establish specific targets. Example:
  • Specific: Achieve 75% completion rate
  • Measurable: Track in Usuarios section weekly
  • Achievable: Based on 65% in last campaign
  • Relevant: Aligns with client’s L&D objectives
  • Time-bound: Within 4-week campaign window

Troubleshooting Analytics Issues

Data Not Updating

Possible Causes:
  • Browser cache
  • Database caching
  • Server processing delay
Solutions:
  1. Hard refresh browser (Ctrl+Shift+R / Cmd+Shift+R)
  2. Wait 5-10 minutes for batch processing
  3. Clear browser cache and cookies
  4. Try in incognito/private window
Charts Not Rendering

Possible Causes:
  • JavaScript errors
  • Chart.js library not loading
  • Ad blocker interference
Solutions:
  1. Check browser console for errors (F12 → Console)
  2. Disable ad blockers/privacy extensions
  3. Try different browser
  4. Ensure JavaScript is enabled
CSV Export Problems

Possible Causes:
  • Filters not applied before export
  • Encoding issues with special characters
Solutions:
  1. Re-apply filters and export again
  2. Open CSV with UTF-8 encoding
  3. Use Excel’s “Get Data from Text/CSV” with proper encoding

Next Steps

Master Admin Workflow

See how analytics fit into your complete administrative process

Improve User Engagement

Use insights to optimize user participation and completion

Refine Dilemma Content

Apply analytics to improve question quality and difficulty

Onboard More Clients

Use proven analytics to demonstrate value to new clients
