Rowboat includes background services that continuously monitor your data sources and maintain your knowledge graph. These agents work autonomously to keep your information up-to-date without manual intervention.

Service Architecture

Rowboat runs multiple independent services, each responsible for a specific data source or workflow:

Gmail Sync

Monitors your Gmail inbox for new threads and syncs them to markdown files. Interval: every 5 minutes.

Fireflies Sync

Pulls meeting transcripts from Fireflies and saves them locally. Interval: every 5 minutes.

Granola Sync

Syncs meeting notes from Granola. Interval: every 5 minutes.

Knowledge Graph Builder

Processes all synced content to extract entities and update your knowledge base. Interval: every 30 seconds.
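Taken together, the intervals above amount to a schedule like this (an illustrative summary only; each service hard-codes its own interval constant rather than reading it from a shared table):

```typescript
// Illustrative registry of the sync intervals listed above.
// The real services each define their own interval constant.
const SERVICE_INTERVALS_MS: Record<string, number> = {
  gmail_sync: 5 * 60 * 1000,      // every 5 minutes
  fireflies_sync: 5 * 60 * 1000,  // every 5 minutes
  granola_sync: 5 * 60 * 1000,    // every 5 minutes
  graph_builder: 30 * 1000,       // every 30 seconds
};
```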

How Services Work

1. Data Source Services

Each data source (Gmail, Fireflies, Granola) runs as an independent service:
```typescript
export async function init() {
  console.log("Starting Gmail Sync...");
  console.log(`Will sync every ${SYNC_INTERVAL_MS / 1000} seconds.`);

  while (true) {
    try {
      const hasCredentials = await checkCredentials();

      if (!hasCredentials) {
        console.log("OAuth credentials not available. Sleeping...");
      } else {
        await performSync();
      }
    } catch (error) {
      console.error("Error in main loop:", error);
    }

    await sleep(SYNC_INTERVAL_MS);
  }
}
```
Gmail Sync Flow:
  1. Check for changes: uses the Gmail History API with the stored historyId to detect new messages efficiently.
  2. Fetch thread details: for each new thread, fetches the full message content, including attachments.
  3. Convert to markdown: converts HTML emails to clean markdown.
  4. Save locally: writes the thread to ~/.rowboat/gmail_sync/{threadId}.md.
  5. Update state: saves the new historyId for the next sync.
Incremental vs Full Sync:
  • First run: Full sync of last 30 days
  • Subsequent runs: Incremental sync using Gmail History API
  • Fallback: If historyId expires (404), falls back to full sync
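The incremental-versus-full decision can be sketched as a small helper. Names like `decideSyncMode` and the `SyncState` shape are illustrative, not the actual implementation:

```typescript
// Hypothetical sketch of the sync-mode decision described above.
type SyncState = { historyId?: string };
type SyncMode = 'full' | 'incremental';

function decideSyncMode(state: SyncState, historyExpired: boolean): SyncMode {
  // First run: no stored historyId yet, so sync the last 30 days in full.
  if (!state.historyId) return 'full';
  // Gmail returns 404 when a stored historyId has expired; fall back to full sync.
  if (historyExpired) return 'full';
  // Otherwise, ask the History API only for changes since the stored historyId.
  return 'incremental';
}
```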

2. Knowledge Graph Builder

The graph builder service monitors all source folders:
```typescript
const SOURCE_FOLDERS = [
  'gmail_sync',
  'fireflies_transcripts',
  'granola_notes',
];

const SYNC_INTERVAL_MS = 30 * 1000; // 30 seconds

export async function init() {
  console.log('[GraphBuilder] Starting Knowledge Graph Builder Service...');
  console.log(`[GraphBuilder] Monitoring folders: ${SOURCE_FOLDERS.join(', ')}`);

  // Initial run
  await processAllSources();

  // Periodic processing
  while (true) {
    await sleep(SYNC_INTERVAL_MS);
    await processAllSources();
  }
}
```
The graph builder checks for new files every 30 seconds, processes them in batches, and updates your knowledge base incrementally.
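The "new files" check can be sketched roughly as follows. The `pickPending` helper and its state shape are assumptions for illustration; the real builder tracks processed files in its own state format:

```typescript
// Illustrative sketch of selecting unprocessed source files.
interface SourceFile { path: string; mtimeMs: number; }
interface BuilderState { processed: Record<string, number>; } // path -> mtime when processed

function pickPending(files: SourceFile[], state: BuilderState): SourceFile[] {
  return files.filter(f => {
    const seenAt = state.processed[f.path];
    // Keep files that are new, or modified since they were last processed.
    return seenAt === undefined || f.mtimeMs > seenAt;
  });
}
```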

Service Logging

All services log their activity to a central log file for debugging and monitoring:
```typescript
const run = await serviceLogger.startRun({
  service: 'gmail',
  message: 'Syncing Gmail',
  trigger: 'timer',
});

await serviceLogger.log({
  type: 'changes_identified',
  service: run.service,
  runId: run.runId,
  level: 'info',
  message: `Found ${threadIds.length} threads to sync`,
  counts: { threads: threadIds.length },
});

await serviceLogger.log({
  type: 'run_complete',
  service: run.service,
  runId: run.runId,
  level: 'info',
  message: `Gmail sync complete: ${threadIds.length} threads`,
  durationMs: Date.now() - run.startedAt,
  outcome: 'ok',
  summary: { threads: threadIds.length },
});
```
Logs are stored at: ~/.rowboat/logs/services.jsonl

Example entries (pretty-printed here for readability; each entry occupies a single line in the file):

```json
{
  "type": "run_start",
  "service": "gmail",
  "runId": "gmail_1234567890",
  "level": "info",
  "message": "Syncing Gmail",
  "trigger": "timer",
  "ts": "2026-02-28T10:30:00.000Z"
}

{
  "type": "changes_identified",
  "service": "gmail",
  "runId": "gmail_1234567890",
  "level": "info",
  "message": "Found 5 threads to sync",
  "counts": { "threads": 5 },
  "items": ["thread1.md", "thread2.md", "thread3.md", "thread4.md", "thread5.md"],
  "ts": "2026-02-28T10:30:15.000Z"
}

{
  "type": "run_complete",
  "service": "gmail",
  "runId": "gmail_1234567890",
  "level": "info",
  "message": "Gmail sync complete: 5 threads",
  "durationMs": 12340,
  "outcome": "ok",
  "summary": { "threads": 5 },
  "ts": "2026-02-28T10:30:27.340Z"
}
```
Log rotation: Files rotate after 10MB to services.{timestamp}.jsonl
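A minimal sketch of that rotation policy, assuming a plain size check before writes (the real serviceLogger may implement this differently; `rotatedName` and `maybeRotate` are illustrative names):

```typescript
import * as fs from 'node:fs';
import * as path from 'node:path';

const MAX_LOG_BYTES = 10 * 1024 * 1024; // rotate after 10MB

// Build the timestamped rotation target, e.g. services.2026-02-28T10-30-00-000Z.jsonl
function rotatedName(logPath: string, now: Date = new Date()): string {
  const stamp = now.toISOString().replace(/[:.]/g, '-');
  return path.join(path.dirname(logPath), `services.${stamp}.jsonl`);
}

function maybeRotate(logPath: string): void {
  try {
    const { size } = fs.statSync(logPath);
    if (size >= MAX_LOG_BYTES) {
      fs.renameSync(logPath, rotatedName(logPath));
    }
  } catch {
    // Log file does not exist yet; nothing to rotate.
  }
}
```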

Agent Runtime

The agent runtime handles executing AI workflows for note creation:

Processing Batches

When new files need processing:
  1. Build knowledge index: scans existing notes to build an in-memory index of all entities.
  2. Create agent run: spawns a new agent instance with the note_creation agent.
  3. Build context: prepares a message with:
     • the knowledge base index
     • the batch of files to process
     • instructions for entity extraction
  4. Execute agent: the agent reads the files, extracts entities, and creates or updates notes.
  5. Track changes: monitors which notes were created or modified.
  6. Update state: marks files as processed and commits the changes to git.
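The context-building step might be sketched like this; the `buildContext` helper and its message layout are illustrative assumptions, not the runtime's actual prompt format:

```typescript
// Hypothetical sketch of assembling the message handed to the note_creation agent.
interface BatchFile { path: string; content: string; }

function buildContext(index: string[], batch: BatchFile[]): string {
  const parts = [
    '## Knowledge base index',
    ...index.map(entity => `- ${entity}`),
    '## Files to process',
    ...batch.map(f => `### ${f.path}\n${f.content}`),
    '## Instructions',
    'Extract entities and create or update notes for each.',
  ];
  return parts.join('\n');
}
```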

Agent Configuration

Agents are configured using markdown files with YAML frontmatter:
```markdown
---
model: claude-3-5-sonnet-20241022
tools:
  workspace-readFile:
    type: builtin
    name: readFile
  workspace-writeFile:
    type: builtin
    name: writeFile
  workspace-edit:
    type: builtin
    name: edit
---

# Agent Instructions

You are a note creation agent...
```
The runtime loads agents dynamically:
```typescript
export async function loadAgent(id: string): Promise<Agent> {
  if (id === 'note_creation') {
    const strictness = getNoteCreationStrictness();
    let raw = '';
    switch (strictness) {
      case 'medium': raw = noteCreationMediumRaw; break;
      case 'low': raw = noteCreationLowRaw; break;
      case 'high': raw = noteCreationHighRaw; break;
    }
    // Parse frontmatter and return agent config
  }
  // Load from ~/.rowboat/agents/ directory
}
```
The note creation agent has three variants (high/medium/low) based on your configured strictness level.
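Since agent files combine YAML frontmatter with markdown instructions, loading one involves splitting the two. A minimal sketch of that split, assuming the simple `---` delimiter format shown above (a real loader would parse the YAML with a proper library; `splitFrontmatter` is an illustrative name):

```typescript
// Split a raw agent file into its frontmatter and instruction body.
function splitFrontmatter(raw: string): { frontmatter: string; body: string } {
  const match = raw.match(/^---\n([\s\S]*?)\n---\n?([\s\S]*)$/);
  if (!match) return { frontmatter: '', body: raw }; // no frontmatter block
  return { frontmatter: match[1], body: match[2] };
}
```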

Error Handling

Services include robust error handling:

Automatic Retry

If a sync fails, the service:
  1. Logs the error
  2. Abandons the remainder of the current run
  3. Retries automatically on the next scheduled iteration

Partial Progress

The knowledge graph builder saves progress after each batch:
```typescript
for (let i = 0; i < contentFiles.length; i += BATCH_SIZE) {
  const batch = contentFiles.slice(i, i + BATCH_SIZE);
  const batchNumber = Math.floor(i / BATCH_SIZE) + 1;

  try {
    await createNotesFromBatch(batch, batchNumber, indexForPrompt);

    // Mark files as processed
    for (const file of batch) {
      markFileAsProcessed(file.path, state);
    }

    // Save state after successful batch
    saveState(state);
  } catch (error) {
    console.error(`Error processing batch ${batchNumber}:`, error);
    // Continue with next batch
  }
}
```
This ensures that if processing fails mid-way, already-processed files won’t be repeated.

OAuth Expiration

If OAuth credentials expire (401 error):
```typescript
if (error.response?.status === 401) {
  console.log("401 Unauthorized, clearing cache");
  GoogleClientFactory.clearCache();
}
```
The service clears cached credentials and prompts for re-authentication on next run.

Manual Triggers

You can trigger syncs manually without waiting for the scheduled interval:
```typescript
import { triggerSync } from './sync_gmail.js';

triggerSync(); // Wakes up the sync service immediately
```
This uses an interruptible sleep pattern:
```typescript
// Holds the resolver of the currently pending sleep, if any.
let wakeResolve: (() => void) | null = null;

function interruptibleSleep(ms: number): Promise<void> {
  return new Promise(resolve => {
    const timeout = setTimeout(() => {
      wakeResolve = null;
      resolve();
    }, ms);
    wakeResolve = () => {
      clearTimeout(timeout);
      resolve();
    };
  });
}

export function triggerSync(): void {
  if (wakeResolve) {
    wakeResolve();
    wakeResolve = null;
  }
}
```

Source Code Reference

Key implementation files:
  • apps/x/packages/core/src/knowledge/build_graph.ts:522-657 - Main graph builder service
  • apps/x/packages/core/src/knowledge/sync_gmail.ts:443-467 - Gmail sync service
  • apps/x/packages/core/src/agents/runtime.ts:36-148 - Agent runtime
  • apps/x/packages/core/src/services/service_logger.ts - Service logging infrastructure
