Effective moderation ensures your Campus community remains safe, respectful, and productive. This guide covers the moderation system, reporting tools, and best practices.

Understanding Moderation

Campus provides a structured moderation system with:
  • User-generated reports
  • Automated flagging
  • Moderation queue
  • Action logging
  • Real-time updates

Moderation Permissions

Who can moderate content:
Role               | Can Moderate
-------------------|----------------------------
Content Author     | Own posts and comments
Group Moderator    | All content in their groups
Space Moderator    | All content in their spaces
Space Owner        | All content in their spaces

The Moderation System

Moderation Cases

When content is flagged, a moderation case is created:
{
  id: string,
  sourceType: 'post' | 'comment' | 'message',
  sourceId: string,           // ID of flagged content
  state: 'open' | 'in_review' | 'escalated' | 'resolved',
  evidence: Array<{
    type: string,             // 'user_report' | 'auto_flag'
    reason: string,           // Reason for flag
    body: string,             // Content snapshot
    capturedAt: DateTime
  }>,
  assignedTo: string,         // Moderator user ID
  created: DateTime,
  updated: DateTime
}
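To make the schema concrete, here is a hypothetical case built from a single user report; the IDs, timestamps, and report text are invented for illustration:

```javascript
// Illustrative moderation case following the schema above.
// All values here are made up; they are not real Campus data.
const exampleCase = {
  id: 'case_123',
  sourceType: 'post',
  sourceId: 'post_456',
  state: 'open',
  evidence: [
    {
      type: 'user_report',
      reason: 'Spam or misleading',
      body: 'Get rich quick! Click here now!',   // snapshot of the flagged content
      capturedAt: new Date('2024-01-15T10:00:00Z')
    }
  ],
  assignedTo: 'user_mod_1',                      // moderator auto-assigned to the case
  created: new Date('2024-01-15T10:00:00Z'),
  updated: new Date('2024-01-15T10:00:00Z')
};
```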

Case States

1. Open

New case, awaiting review:
  • Appears in moderation queue
  • SLA timer starts (15 minutes)
  • Automatically assigned or available for claim

2. In Review

Moderator is actively investigating:
  • Assigned to specific moderator
  • Evidence is being evaluated
  • Actions are being considered

3. Escalated

Case requires higher-level attention:
  • Complex or sensitive situation
  • Potential policy violation
  • Requires administrator decision

4. Resolved

Case is closed:
  • Action taken (or no action needed)
  • Logged for records
  • Archived but searchable
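The lifecycle above can be enforced with a small transition guard. This is an illustrative sketch, not Campus's actual implementation; in particular, which direct jumps are permitted (such as open straight to resolved) is an assumption here:

```javascript
// Allowed moderation-case state transitions, per the lifecycle above.
// Sketch only: the exact set of permitted jumps is assumed.
const TRANSITIONS = {
  open: ['in_review', 'resolved'],
  in_review: ['escalated', 'resolved'],
  escalated: ['resolved'],
  resolved: []                        // terminal: archived but searchable
};

function canTransition(from, to) {
  return (TRANSITIONS[from] || []).includes(to);
}
```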

Accessing the Moderation Dashboard

Only users with moderator or owner roles can access the moderation dashboard.
  1. Navigate to /admin/moderation
  2. View the moderation queue with active cases
  3. See case statistics in the overview cards

Dashboard Overview

The dashboard shows:

Statistics Cards
  • Open cases count
  • In review cases count
  • Escalated cases count (highlighted in red)
  • Resolved cases count (shown in green)
Filter Options
  • All cases
  • Open only
  • In review only
  • Escalated only
  • Resolved only
Real-time Updates

The dashboard subscribes to live updates, so new cases appear automatically without refreshing.

Handling Reports

Viewing a Case

1. Access the Case

Click on a case card in the moderation queue to expand details.

2. Review Evidence

Examine all evidence entries:
  • Type of report (user report or auto-flag)
  • Reason provided
  • Content snapshot
  • Number of flags from different users
Multiple reports on the same content are consolidated into one case with multiple evidence entries.

3. View Source Content

Click “View Source” to see the content in context:
  • For posts: opens the post in the feed
  • For comments: navigates to the comment location
  • For messages: opens the message thread
This helps you understand the full context.

4. Assess Violation

Determine whether the content violates community standards:
  • Check against posted rules
  • Consider context and intent
  • Review the user’s history if needed
  • Consult with other moderators for complex cases

Taking Action

Depending on your assessment:

If the content does not violate the rules:
  1. Click “Resolve Case”
  2. The case is marked as resolved
  3. The reporter receives a notification (optional)
The content remains visible and no further action is taken.

If the content does violate the rules, take one of the moderation actions described under “Moderation Actions” below (for example, deleting the post or comment), then resolve the case.

SLA Management

SLA Breach Alert: cases open for more than 15 minutes are marked with a red “SLA Breach” badge. Prioritize these cases to ensure timely moderation.
Best practices for meeting SLA:
  • Check the queue regularly (multiple times per day)
  • Assign multiple moderators for coverage
  • Set up notifications for new cases
  • Have clear guidelines for quick decisions
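The 15-minute SLA check described above amounts to comparing a case's age against the threshold; a minimal sketch:

```javascript
// Sketch of the 15-minute SLA check. The function name and signature
// are illustrative, not part of the Campus API.
const SLA_MINUTES = 15;

function slaBreached(caseCreated, now = new Date()) {
  const ageMinutes = (now - caseCreated) / 60000; // Date subtraction yields ms
  return ageMinutes > SLA_MINUTES;
}
```

A dashboard could run this against every open case to decide which cards get the red badge.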

Automated Moderation

Auto-flagging

Campus automatically flags certain content.

Video Post Flags

Video posts are automatically flagged and sent for review with metadata:
{
  type: 'auto_flag',
  value: 'video_post',
  meta: {
    scope: string,          // Where posted
    attachmentCount: number,
    hasAltText: boolean,
    hasPoster: boolean,     // Thumbnail present
    videoDuration: number,  // In seconds
    contentLength: number,  // Caption length
    containsLinks: boolean,
    wordCount: number
  }
}
Auto-flagged cases:
  • Start in “in_review” state
  • Require moderator approval
  • Help catch policy violations early
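A helper that derives the auto-flag metadata above from a video post might look like the following. This is a hedged sketch: the `post` shape (its `scope`, `content`, and `attachments` fields) is assumed for illustration and is not the documented Campus data model:

```javascript
// Hypothetical helper that builds the auto_flag meta object shown above.
// The shape of `post` is an assumption made for this example.
function buildVideoFlagMeta(post) {
  const text = post.content || '';
  return {
    scope: post.scope,                                        // where posted
    attachmentCount: post.attachments.length,
    hasAltText: post.attachments.every(a => Boolean(a.altText)),
    hasPoster: post.attachments.some(a => Boolean(a.poster)), // thumbnail present
    videoDuration: post.attachments[0]?.duration ?? 0,        // in seconds
    contentLength: text.length,                               // caption length
    containsLinks: /https?:\/\//.test(text),
    wordCount: text.trim() ? text.trim().split(/\s+/).length : 0
  };
}
```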

Configuring Auto-flags

Administrators can configure automated flagging rules for:
  • Specific content types
  • Keyword patterns
  • Spam detection
  • Link analysis
  • User behavior patterns

User Reporting

How Users Report Content

Users can report problematic content:
  1. Click “Report” on a post or comment
  2. Select a reason:
    • Spam or misleading
    • Harassment or hate speech
    • Violence or dangerous content
    • Inappropriate or offensive
    • Other (with explanation)
  3. Optionally provide additional context
  4. Submit report
Reports are anonymous to maintain reporter safety.

Handling Multiple Reports

When multiple users report the same content:
  • Evidence is appended to existing case
  • Flag count is tracked
  • Duplicate reports are deduplicated
  • Higher priority in queue
// Evidence deduplication
function evidenceSignature(entry) {
  return JSON.stringify({
    type: entry.type,
    value: entry.value,
    meta: entry.meta
  });
}
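Using the signature function above, incoming evidence can be appended only when it is genuinely new; a minimal sketch:

```javascript
// Same signature function as above.
function evidenceSignature(entry) {
  return JSON.stringify({
    type: entry.type,
    value: entry.value,
    meta: entry.meta
  });
}

// Append an evidence entry to a case only if an identical entry
// is not already present; otherwise return the list unchanged.
function appendEvidence(existing, entry) {
  const seen = new Set(existing.map(evidenceSignature));
  if (seen.has(evidenceSignature(entry))) return existing; // duplicate report
  return [...existing, entry];
}
```

Note that `JSON.stringify` is sensitive to property order, so evidence entries must be constructed with a consistent key order for their signatures to match.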

Moderation Actions

Deleting Posts

To delete a post with proper logging:
await deletePostModerated(postId);
This function:
  1. Verifies you have permission
  2. Deletes the post
  3. Creates a moderation log entry
  4. Returns success status

Deleting Comments

To delete a comment:
await deleteCommentModerated(commentId);
Process:
  1. Checks moderation permission
  2. Deletes the comment
  3. Logs the action with post context
  4. Maintains thread integrity

Moderation Logs

All moderation actions are logged:
{
  actor: string,        // Moderator user ID
  action: string,       // 'delete_post' | 'delete_comment' | etc.
  meta: object,         // Action details
  timestamp: DateTime
}
Logs are used for:
  • Accountability
  • Audit trails
  • Pattern analysis
  • Training and improvement

Moderation Best Practices

Apply rules uniformly:
  • Document your decisions
  • Create moderation guidelines
  • Have regular team meetings
  • Review cases together to calibrate
Inconsistency erodes trust and fairness.

Respond to reports promptly:
  • Check the queue at least 3x daily
  • Prioritize SLA breaches
  • Handle clear violations immediately
  • Escalate uncertain cases quickly
Fast response prevents issues from escalating.

When taking action:
  • Explain the violation (when appropriate)
  • Reference specific rules
  • Provide an appeals process
  • Be respectful but firm
Clear communication helps users understand and comply.

For repeat offenders:
  1. First violation: warning + delete content
  2. Second violation: temporary restriction
  3. Third violation: extended restriction
  4. Persistent violations: permanent ban
Document all steps for accountability.

Never reveal who reported content:
  • Reports are anonymous
  • Don’t mention that “someone reported” the content
  • Focus on the violation, not the report
Protecting reporters encourages reporting.

Moderation can be emotionally taxing:
  • Rotate difficult content among the team
  • Take breaks when needed
  • Debrief with fellow moderators
  • Know when to escalate
Your wellbeing matters too.
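The repeat-offender ladder above maps a user's violation count to a graduated response; it can be sketched as a small lookup function (the action names here are illustrative, not Campus API values):

```javascript
// Sketch of the escalation ladder for repeat offenders.
// Action identifiers are invented for illustration.
function recommendedAction(violationCount) {
  if (violationCount <= 1) return 'warn_and_delete';      // first violation
  if (violationCount === 2) return 'temporary_restriction';
  if (violationCount === 3) return 'extended_restriction';
  return 'permanent_ban';                                 // persistent violations
}
```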

Notifications and Alerts

Moderator Notifications

Moderators receive notifications for:

New Cases
  • When content is reported
  • When auto-flags are triggered
  • Real-time in the dashboard
Escalations
  • When a case is escalated
  • Email notification to administrators
  • Uses template: moderation-escalated.html
Daily Summary
  • Daily digest of moderation activity
  • Statistics and trends
  • Uses template: moderation-daily-summary.html

Configuring Notifications

Administrators can configure:
  • Who receives notifications
  • Notification frequency
  • Escalation thresholds
  • Email templates

Moderation Reports and Analytics

Key Metrics

Track moderation effectiveness:
  • Average resolution time: How long cases stay open
  • SLA compliance rate: Percentage resolved within 15 minutes
  • Case volume trends: Increasing or decreasing reports
  • Action breakdown: Deletions vs. no action vs. escalations
  • Top reporters: Users who report most frequently
  • Top reported users: Users most frequently reported
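The first two metrics can be computed directly from resolved cases using the `created` and `updated` timestamps from the case schema earlier in this guide; a minimal sketch:

```javascript
// Sketch: average resolution time and SLA compliance rate from
// resolved cases. Case timestamps follow the schema shown earlier.
const SLA_MS = 15 * 60 * 1000; // the 15-minute SLA, in milliseconds

function moderationMetrics(resolvedCases) {
  if (resolvedCases.length === 0) {
    return { avgResolutionMinutes: 0, slaComplianceRate: 1 };
  }
  const durations = resolvedCases.map(c => c.updated - c.created);
  const avgMs = durations.reduce((a, b) => a + b, 0) / durations.length;
  const withinSla = durations.filter(d => d <= SLA_MS).length;
  return {
    avgResolutionMinutes: avgMs / 60000,
    slaComplianceRate: withinSla / resolvedCases.length
  };
}
```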

Using Analytics

Insights help you:
  • Identify problematic users or patterns
  • Adjust moderation resources
  • Update community guidelines
  • Train new moderators
  • Demonstrate moderation effectiveness

Common Scenarios

Spam Posts

Identification:
  • Repeated content
  • Promotional links
  • Irrelevant to community
Action:
  • Delete immediately
  • Check for other spam from user
  • Consider account restriction

Heated Arguments

Identification:
  • Multiple reports
  • Personal attacks
  • Derailed conversation
Action:
  • Review full thread
  • Delete inflammatory comments
  • Consider locking thread
  • Message users privately

Inappropriate Content

Identification:
  • NSFW material
  • Offensive language
  • Violates ToS
Action:
  • Delete immediately
  • Warn or suspend user
  • Document for repeat offenders

Harassment

Identification:
  • Targeted negativity
  • Repeated unwanted contact
  • Threatening language
Action:
  • Escalate immediately
  • Document all evidence
  • Consider user suspension
  • Notify administrators

Appeals Process

Users should have a way to appeal moderation decisions:
1. User Submits Appeal

Provide a contact method (email, form, etc.) for appeals.

2. Different Moderator Reviews

Have a different moderator review the appeal to avoid bias.

3. Decision Communicated

Clearly explain the decision:
  • If upheld: why the original decision stands
  • If reversed: apologize and restore content if appropriate

4. Learn from Appeals

Use successful appeals to:
  • Improve guidelines
  • Train moderators
  • Update policies

Next Steps

Manage Spaces

Learn how to configure and manage your spaces

Manage Groups

Organize your community with groups and manage members
