Overview
Polaris IDE’s conversation system provides an intelligent AI assistant integrated directly into your IDE. The assistant can understand your code, execute tools to modify files, run commands, and provide contextual help through a real-time streaming interface.

Message history and storage
Conversations are persisted in a Convex database with the following structure:

- Conversations: Each project has multiple conversations
- Messages: Each conversation contains user and assistant messages
- Tool calls: Messages can include tool executions (file operations, searches, etc.)
- Status tracking: Messages track processing state (pending, processing, completed, failed)
All conversations are project-scoped, meaning each project maintains its own conversation history.
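The document shapes below are an illustrative sketch of this structure; the field names and types are assumptions for clarity, not Polaris’s actual Convex schema.

```typescript
// Illustrative shapes for the persisted documents. Field names are
// assumptions, not the actual Polaris schema.
type MessageStatus = "pending" | "processing" | "completed" | "failed" | "cancelled";

interface Conversation {
  projectId: string; // conversations are project-scoped
  title: string;
  createdAt: number;
}

interface ToolCall {
  toolName: string; // e.g. a file operation or search
  args: Record<string, unknown>;
  result?: string;
}

interface Message {
  conversationId: string;
  role: "user" | "assistant";
  content: string;
  status: MessageStatus; // processing state tracked per message
  toolCalls: ToolCall[];
}

// Example: an empty assistant message awaiting its streamed response.
const placeholder: Message = {
  conversationId: "conv_123",
  role: "assistant",
  content: "",
  status: "processing",
  toolCalls: [],
};
```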
Message flow
Message creation
The API creates both the user message and an empty assistant message with “processing” status.
Background job trigger
A Trigger.dev task is launched to process the assistant’s response asynchronously.
Streaming response
The AI generates a response with tool calls, streamed back in real time with a 100ms throttle.
Streaming responses
The conversation system uses real-time streaming to display AI responses as they’re generated.

Streaming implementation
Streaming updates are throttled to 100ms intervals to prevent database overload while maintaining a smooth user experience.
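The throttle can be sketched as follows. This is a minimal, self-contained illustration; the `persist` callback and the injectable clock are assumptions, not the actual implementation. Each flush persists the full accumulated text, matching an overwrite of the message content in the database.

```typescript
// Minimal sketch of a 100ms write throttle for streamed text.
// `persist` (the database write) and the injectable clock are
// illustrative assumptions.
function createThrottledWriter(
  persist: (fullText: string) => void,
  intervalMs = 100,
  now: () => number = Date.now,
) {
  let buffer = "";
  let lastFlush = 0;

  return {
    // Append a streamed chunk; persist only if the interval has elapsed.
    push(chunk: string) {
      buffer += chunk;
      if (now() - lastFlush >= intervalMs) {
        persist(buffer);
        lastFlush = now();
      }
    },
    // Always flush when the stream ends so no trailing text is lost.
    finish() {
      persist(buffer);
    },
  };
}

// Usage with a fake clock to make the throttling visible:
let t = 0;
const writes: string[] = [];
const writer = createThrottledWriter((s) => writes.push(s), 100, () => t);
writer.push("Hel"); // t=0: buffered, interval not yet elapsed
t = 150;
writer.push("lo"); // t=150: 150ms elapsed, persists "Hello"
writer.finish(); // final flush of the full text
```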
Context management
The AI assistant receives comprehensive context for each message.

System prompt
The assistant is initialized with a detailed system prompt that explains:

- Its identity as “Polaris, an AI coding assistant”
- Available tools and their purposes
- When to use each tool category
- Best practices for code generation
Message context
Tool calling system
The conversation AI has access to powerful tools for interacting with your project.

Available tool categories
File management
Read, write, delete files, list directories, get project structure
LSP (Language Server)
Find symbols, get references, diagnostics, go to definition
Code search
Regex search, AST-aware search, find files by pattern
Context & relevance
Find relevant files using import analysis and symbol matching
Terminal
Execute safe commands (npm, git, node, tsc, eslint, etc.)
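For the terminal category, command safety can be enforced with an allowlist check before execution. The sketch below is an assumption based on the binaries named above; the actual list and validation logic live in the tool implementation.

```typescript
// Sketch of an allowlist check for the terminal tool. The set of safe
// binaries is an assumption drawn from the examples in the docs.
const SAFE_COMMANDS = new Set(["npm", "git", "node", "tsc", "eslint"]);

function isSafeCommand(commandLine: string): boolean {
  // Only the leading binary name is checked in this sketch; a real
  // implementation would also guard against shell metacharacters.
  const binary = commandLine.trim().split(/\s+/)[0];
  return SAFE_COMMANDS.has(binary);
}
```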
Tool execution flow
Trigger.dev background jobs integration
Conversations use Trigger.dev for asynchronous message processing.

Why Trigger.dev?
- Long-running tasks - AI responses can take several seconds
- Reliable execution - Automatic retries and error handling
- Cancellation support - Users can cancel in-progress messages
- Status tracking - Monitor job progress in real-time
Background job lifecycle
Message status tracking
Messages progress through these states:

| Status | Description | User visible |
|---|---|---|
| pending | Message created, waiting to be processed | Loading indicator |
| processing | AI is generating response | Streaming text appears |
| completed | Response fully generated | Full message visible |
| failed | Error occurred during processing | Error message shown |
| cancelled | User cancelled the message | Cancelled indicator |
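The legal transitions implied by the table can be encoded as a simple state map. The transition set below is an assumption derived from the descriptions above, not the actual Polaris code.

```typescript
// Sketch of the status transitions implied by the table above.
// The transition map is an assumption, not the actual implementation.
type Status = "pending" | "processing" | "completed" | "failed" | "cancelled";

const TRANSITIONS: Record<Status, Status[]> = {
  pending: ["processing", "cancelled"],
  processing: ["completed", "failed", "cancelled"],
  // Terminal states: no further transitions.
  completed: [],
  failed: [],
  cancelled: [],
};

function canTransition(from: Status, to: Status): boolean {
  return TRANSITIONS[from].includes(to);
}
```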
API implementation
The messages endpoint is located at /api/messages.
Performance metrics
The conversation system tracks performance metrics.

Error handling
The conversation system handles errors gracefully:

- Processing errors - Message status set to “failed” with error message
- Conversation not found - Returns 404 with clear error
- Unauthorized access - Returns 403 for unauthenticated requests
- Internal errors - Logged and returned as 500 with safe error message
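These cases map to HTTP responses as sketched below. The error class names are hypothetical; only the status codes and the logged-but-safe behavior for internal errors come from the list above.

```typescript
// Sketch mapping the error cases above to HTTP responses.
// The error class names are illustrative, not the actual code.
class ConversationNotFoundError extends Error {}
class UnauthorizedError extends Error {}

function toHttpError(err: unknown): { status: number; body: { error: string } } {
  if (err instanceof ConversationNotFoundError) {
    return { status: 404, body: { error: "Conversation not found" } };
  }
  if (err instanceof UnauthorizedError) {
    return { status: 403, body: { error: "Unauthorized" } };
  }
  // Internal errors: log the details server-side, return a safe message.
  console.error(err);
  return { status: 500, body: { error: "Internal server error" } };
}
```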
Source code reference
Implementation details:

- API route: src/app/api/messages/route.ts:20
- Background task: trigger/tasks/process-message.ts:53
- System prompt: trigger/tasks/process-message.ts:17
- Tool creation: trigger/tasks/process-message.ts:70
- Streaming logic: trigger/tasks/process-message.ts:91