
Overview

Polaris IDE uses Sentry for comprehensive error tracking, performance monitoring, and AI operation insights.
Sentry helps you catch and fix issues before users report them, and provides AI-specific monitoring for LLM calls.

Features

Error Tracking

Capture and aggregate JavaScript errors with stack traces

Performance Monitoring

Track page load times, API latency, and Core Web Vitals

AI Monitoring

Monitor LLM calls, token usage, and AI performance

Console Logging

Capture console.log/warn/error for debugging

Setup

Installation

Sentry is pre-installed in Polaris:
package.json
{
  "dependencies": {
    "@sentry/nextjs": "^10.32.1"
  }
}

Configuration Files

Polaris includes Sentry configuration for different environments:
Location: sentry.server.config.ts
import * as Sentry from "@sentry/nextjs";

Sentry.init({
  dsn: process.env.SENTRY_DSN,
  
  // Sample rate for performance monitoring
  tracesSampleRate: 1.0,
  
  // Enable server-side logging
  enableLogs: true,
  
  // Send user info (emails, IDs)
  sendDefaultPii: true,
  
  integrations: [
    // Monitor AI/LLM calls
    Sentry.vercelAIIntegration(),
    
    // Capture console logs
    Sentry.consoleLoggingIntegration({
      levels: ["log", "warn", "error"]
    })
  ]
});
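
The client-side counterpart is similar. A minimal sketch (the exact filename depends on the Next.js setup, commonly sentry.client.config.ts or instrumentation-client.ts, and the options shown are assumptions, not Polaris's exact config):

```typescript
import * as Sentry from "@sentry/nextjs";

Sentry.init({
  // Client bundles need a NEXT_PUBLIC_-prefixed variable
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN,

  // Sample rate for performance monitoring
  tracesSampleRate: 1.0,

  integrations: [
    // Capture browser console output
    Sentry.consoleLoggingIntegration({
      levels: ["warn", "error"]
    })
  ]
});
```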

Environment Variables

Add to .env.local:
# Sentry DSN (get from sentry.io dashboard)
SENTRY_DSN=https://5a5ad5d9846faece0a4727540f810281@o4510149980258304.ingest.de.sentry.io/4510621155983440

# Optional: For client-side if different
NEXT_PUBLIC_SENTRY_DSN=$SENTRY_DSN

# Organization and project (for source maps)
SENTRY_ORG=your-org
SENTRY_PROJECT=polaris-ide

# Auth token for uploading source maps
SENTRY_AUTH_TOKEN=your-auth-token

Error Tracking

Automatic Error Capture

Sentry automatically captures:
// Uncaught exceptions are captured automatically
throw new Error("Something went wrong!");

// Unhandled promise rejections are captured
fetch("/api/data").then(res => {
  if (!res.ok) throw new Error("API failed");
});
// No .catch(), so Sentry captures the rejection

// React render errors are captured (via error boundaries)
function BuggyComponent() {
  const [count, setCount] = useState(0);
  
  if (count > 5) {
    throw new Error("Count too high!");
  }
  
  return <button onClick={() => setCount(count + 1)}>Count: {count}</button>;
}

// Errors thrown in API route handlers are captured
export async function POST(req: Request) {
  const data = await req.json();
  
  if (!data.name) {
    throw new Error("Name is required");
  }
  
  return Response.json({ success: true });
}

Manual Error Capture

Capture errors explicitly with context:
import * as Sentry from "@sentry/nextjs";

try {
  await dangerousOperation();
} catch (error) {
  // Capture with additional context
  Sentry.captureException(error, {
    level: "error",
    tags: {
      operation: "project-creation",
      userId: user.id
    },
    extra: {
      projectName: name,
      timestamp: new Date().toISOString()
    },
    user: {
      id: user.id,
      email: user.email
    }
  });
  
  // Show user-friendly message
  toast.error("Failed to create project");
}
Add breadcrumbs to record the events leading up to an error:
import * as Sentry from "@sentry/nextjs";

// Log user actions
Sentry.addBreadcrumb({
  category: "user-action",
  message: "User created new project",
  level: "info",
  data: {
    projectId: "abc123",
    projectName: "My Project"
  }
});

// Log API calls
Sentry.addBreadcrumb({
  category: "api",
  message: "POST /api/projects",
  level: "info",
  data: {
    status: 200,
    duration: "123ms"
  }
});
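
To keep breadcrumb payloads consistent across the codebase, the shape above can be built by a small helper. This is a sketch (makeApiBreadcrumb is a hypothetical helper, not part of the Sentry SDK) kept free of SDK calls so the logic is unit-testable; in the app you would pass its result to Sentry.addBreadcrumb():

```typescript
type ApiBreadcrumb = {
  category: string;
  message: string;
  level: "info" | "warning";
  data: { status: number; duration: string };
};

// Build an API breadcrumb payload for Sentry.addBreadcrumb().
// Failed responses (4xx/5xx) are flagged as warnings so they stand out.
function makeApiBreadcrumb(
  method: string,
  url: string,
  status: number,
  durationMs: number
): ApiBreadcrumb {
  return {
    category: "api",
    message: `${method} ${url}`,
    level: status >= 400 ? "warning" : "info",
    data: { status, duration: `${durationMs}ms` }
  };
}
```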

AI Monitoring

Sentry’s Vercel AI integration tracks LLM calls:

Automatic Tracking

import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";

// Automatically tracked by Sentry
const result = await streamText({
  model: anthropic("claude-sonnet-4-20250514"),
  messages: [
    { role: "user", content: "Write a React component" }
  ]
});

// Sentry captures:
// - Model used
// - Token count
// - Latency
// - Cost estimate
// - Errors

Monitored Metrics

Token Usage

  • Input tokens
  • Output tokens
  • Total tokens
  • Cost per request

Performance

  • Time to first token
  • Total generation time
  • Streaming latency

Model Stats

  • Model name and version
  • Provider (OpenRouter/Cerebras)
  • Success/failure rate

Errors

  • Rate limit errors
  • Invalid requests
  • API timeouts
  • Tool execution failures
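
The cost-per-request metric is derived from the token counts. As an illustration, a minimal estimator might look like the following; the price table is a hypothetical placeholder, since real pricing varies by model and provider:

```typescript
// Hypothetical per-million-token prices in USD. Treat these values
// as placeholders, not actual provider pricing.
const PRICES_PER_MILLION: Record<string, { input: number; output: number }> = {
  "claude-sonnet-4": { input: 3, output: 15 }
};

// Estimate the USD cost of one request from its token usage.
function estimateCostUsd(
  model: string,
  inputTokens: number,
  outputTokens: number
): number {
  const p = PRICES_PER_MILLION[model];
  if (!p) throw new Error(`No pricing entry for model: ${model}`);
  return (inputTokens * p.input + outputTokens * p.output) / 1_000_000;
}
```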

Custom AI Metrics

import * as Sentry from "@sentry/nextjs";

// startTransaction was removed in Sentry v8+; use startSpan instead.
// An exception thrown inside the callback marks the span as errored
// and propagates to the caller automatically.
const text = await Sentry.startSpan(
  { name: "ai-code-suggestion", op: "ai.inference" },
  async (span) => {
    const startTime = Date.now();
    
    const suggestion = await generateSuggestion(code);
    
    span.setAttribute("tokens", suggestion.usage.totalTokens);
    span.setAttribute("model", "claude-sonnet-4");
    span.setAttribute("duration_ms", Date.now() - startTime);
    
    return suggestion.text;
  }
);

Performance Monitoring

Page Load Tracking

Automatically monitors:
  • First Contentful Paint (FCP)
  • Largest Contentful Paint (LCP)
  • Cumulative Layout Shift (CLS)
  • First Input Delay (FID)
  • Time to First Byte (TTFB)

Custom Performance Metrics

import * as Sentry from "@sentry/nextjs";

// Measure specific operations (startSpan returns the callback's result)
const files = await Sentry.startSpan(
  { name: "load-project-files" },
  async () => {
    return await convex.query(api.files.getByProject, { projectId });
  }
);

// Nest spans for sub-operations (startTransaction/startChild were
// removed in Sentry v8+; nested startSpan calls replace them)
await Sentry.startSpan({ name: "compile-code", op: "task" }, async () => {
  await Sentry.startSpan(
    { name: "typescript-compile", op: "typescript.compile" },
    async () => {
      await compileTypeScript(code);
    }
  );
});

Database Query Monitoring

import * as Sentry from "@sentry/nextjs";

export const getProjects = query({
  handler: async (ctx) => {
    // Sentry.getCurrentHub() was removed in v8+; startSpan attaches
    // to the active span automatically
    return await Sentry.startSpan(
      { op: "db.query", name: "Get user projects" },
      async (span) => {
        const projects = await ctx.db
          .query("projects")
          .withIndex("by_owner", (q) => q.eq("ownerId", userId))
          .collect();
        
        span.setAttribute("result_count", projects.length);
        return projects;
      }
    );
  }
});

User Context

Set User Information

import * as Sentry from "@sentry/nextjs";
import { useUser } from "@stackframe/stack";

function UserProvider({ children }) {
  const user = useUser();
  
  useEffect(() => {
    if (user) {
      Sentry.setUser({
        id: user.id,
        email: user.primaryEmail,
        username: user.displayName,
        subscription: user.subscriptionTier
      });
    } else {
      Sentry.setUser(null);
    }
  }, [user]);
  
  return children;
}

Set Context Tags

import * as Sentry from "@sentry/nextjs";

// Set global tags
Sentry.setTag("environment", process.env.NODE_ENV);
Sentry.setTag("deployment", process.env.VERCEL_ENV);

// Set context for specific scope
Sentry.withScope((scope) => {
  scope.setTag("feature", "ai-chat");
  scope.setTag("project_id", projectId);
  scope.setContext("project", {
    name: project.name,
    fileCount: fileCount
  });
  
  Sentry.captureMessage("Project loaded", "info");
});

Alerts & Notifications

Configure Alerts in Sentry Dashboard

1. Navigate to Alerts

   Go to Alerts → Create Alert Rule

2. Choose alert type

   • Issues: Error frequency threshold
   • Metric: Custom metric threshold
   • Uptime: Service availability

3. Set conditions

   Example: Alert when error rate > 10/min
   WHEN errors in project polaris-ide
   IS GREATER THAN 10 events
   IN 1 minute

4. Configure notifications

   Send to:
   • Email
   • Slack
   • Discord
   • PagerDuty
   • Custom webhook

Slack Integration

1. Add Slack integration

   Settings → Integrations → Slack → Install

2. Choose channel

   Select #engineering or #alerts

3. Configure alert

Alerts will appear as:
🔴 New Issue: TypeError in Editor Component
• First seen: 2 minutes ago
• Events: 15
• Users affected: 3
• View: [Issue #1234]

Source Maps

Enable readable stack traces in production:

Automatic Upload

Sentry automatically uploads source maps during build:
next.config.js
const { withSentryConfig } = require("@sentry/nextjs");

module.exports = withSentryConfig(
  {
    // Next.js config
  },
  {
    // Sentry webpack plugin options
    silent: true,
    org: process.env.SENTRY_ORG,
    project: process.env.SENTRY_PROJECT
  }
);

Verify Upload

Check Sentry dashboard:
  1. Go to Settings → Source Maps
  2. Verify release artifacts are uploaded
  3. Check stack traces are deobfuscated

Filtering Noise

Ignore Known Errors

sentry.server.config.ts
Sentry.init({
  dsn: process.env.SENTRY_DSN,
  
  beforeSend(event, hint) {
    // Ignore network errors
    if (event.exception?.values?.[0]?.type === "NetworkError") {
      return null;
    }
    
    // Ignore specific messages
    const message = event.message || "";
    if (message.includes("ResizeObserver loop")) {
      return null;
    }
    
    return event;
  },
  
  ignoreErrors: [
    "ResizeObserver loop limit exceeded",
    "Non-Error promise rejection captured",
    /^AbortError:/
  ]
});
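
The beforeSend rules above can also be factored into a pure predicate so the filtering logic is unit-testable without the SDK (shouldDropEvent is a hypothetical helper, not a Sentry API):

```typescript
// Mirrors the beforeSend logic: drop NetworkErrors and ResizeObserver noise.
function shouldDropEvent(
  exceptionType: string | undefined,
  message: string
): boolean {
  if (exceptionType === "NetworkError") return true;
  if (message.includes("ResizeObserver loop")) return true;
  return false;
}
```

Inside beforeSend, return null when shouldDropEvent(...) is true, otherwise return the event unchanged.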

Sampling

Reduce data in production:
Sentry.init({
  tracesSampleRate: process.env.NODE_ENV === "production" ? 0.1 : 1.0,
  replaysSessionSampleRate: 0.1,
  profilesSampleRate: 0.1
});
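
For finer control than a single flat rate, Sentry.init also accepts a tracesSampler function that decides a rate per transaction. A sketch, where the name-matching rules and the simplified context shape are illustrative assumptions:

```typescript
// Decide a sample rate from the transaction name. Sentry passes a
// richer sampling context; only the name is modeled here for clarity.
function sampleRateFor(ctx: { name: string }): number {
  if (ctx.name.includes("healthcheck")) return 0; // never sample health checks
  if (ctx.name.startsWith("ai.")) return 0.5;     // keep more AI traces
  return 0.1;                                     // default production rate
}
```

Wire it up with tracesSampler: (ctx) => sampleRateFor({ name: ctx.name }) in Sentry.init; the exact context fields depend on the SDK version.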

Best Practices

Add scoped context before capturing:
Sentry.withScope((scope) => {
  scope.setTag("feature", "file-explorer");
  scope.setContext("operation", { type: "delete", fileId });
  Sentry.captureException(error);
});

Write descriptive error messages:
// ❌ Bad
throw new Error("Failed");

// ✅ Good
throw new Error(`Failed to create project "${name}": ${reason}`);

Use breadcrumbs to track user journeys:
Sentry.addBreadcrumb({ message: "User started project creation" });
Sentry.addBreadcrumb({ message: "Files uploaded" });
Sentry.addBreadcrumb({ message: "AI generation started" });
// If an error occurs, you see the full journey

Troubleshooting

Errors not appearing in Sentry

Solutions:
  1. Verify SENTRY_DSN is set correctly
  2. Check Sentry quota hasn’t been exceeded
  3. Ensure errors aren’t filtered by beforeSend
  4. Check browser console for Sentry errors

Stack traces are minified or unreadable

Solutions:
  1. Verify SENTRY_AUTH_TOKEN is set
  2. Check source maps uploaded: Settings → Source Maps
  3. Ensure release matches deployed version
  4. Rebuild with npm run build

Too much data or quota exceeded

Solutions:
  1. Reduce sample rates
  2. Add filters in beforeSend
  3. Use ignoreErrors for known issues
  4. Upgrade Sentry plan if needed

Next Steps

Background Jobs

Monitor Trigger.dev task failures

Architecture

Understand error flow

Deployment

Set up production monitoring

Contributing

Help improve error handling
