
Overview

Teak’s AI pipeline automatically enriches cards with metadata, making them easier to search, organize, and rediscover. All processing happens asynchronously after card creation.

Processing Workflow

When you create a card, it enters an automated AI workflow with four steps: classification, link processing, AI metadata generation, and renderables. The workflow is defined in packages/convex/workflows/cardProcessing.ts:43-298.

Step 1: Classification

Determines the card type if not explicitly provided.
1. Check Explicit Type

If type was provided at creation, skip classification:

```typescript
{
  content: "My idea",
  type: "text" // Explicit type
}
// Classification: { type: "text", confidence: 1.0 }
```
2. Run AI Classifier

For ambiguous cards, an AI model analyzes the content:

```typescript
{
  content: "Check out this amazing tutorial video"
  // No type specified
}
// AI determines: { type: "link", confidence: 0.85 }
```
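Putting the two branches together, type resolution can be sketched as a small helper. This is an illustrative sketch, not Teak's actual classifier code: the 0.7 confidence cutoff and the `"text"` fallback are assumptions.

```typescript
// Sketch only: the threshold and fallback below are assumptions,
// not values taken from cardProcessing.ts.
type CardType = "text" | "link" | "image" | "video" | "audio" | "document";

interface Classification {
  type: CardType;
  confidence: number;
}

function resolveType(
  explicitType: CardType | undefined,
  aiResult: Classification
): Classification {
  // An explicit type skips classification entirely.
  if (explicitType) {
    return { type: explicitType, confidence: 1.0 };
  }
  // A low-confidence AI result falls back to a safe default.
  if (aiResult.confidence < 0.7) {
    return { type: "text", confidence: aiResult.confidence };
  }
  return aiResult;
}
```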
Step 2: Link Processing

For link cards, Teak fetches rich metadata and categorizes the content.

2a. Metadata Extraction

Fetches Open Graph data, page title/description, and images:
```typescript
// Link card metadata structure
{
  metadata: {
    linkPreview: {
      title: "Understanding React Hooks",
      description: "A comprehensive guide to React Hooks...",
      imageUrl: "https://example.com/og-image.jpg",
      imageStorageId: "k1abc...",
      screenshotStorageId: "k2def...",
      status: "success"
    }
  },
  metadataTitle: "Understanding React Hooks",
  metadataDescription: "A comprehensive guide to React Hooks...",
  metadataStatus: "completed"
}
```
Metadata extraction uses the workflow in packages/convex/workflows/steps/linkMetadata/fetchMetadata.ts.
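A minimal sketch of pulling Open Graph tags out of fetched HTML. The regex approach here is illustrative only; the real fetchMetadata.ts workflow is presumably more robust (proper HTML parsing, reversed attribute order, fallbacks to `<title>` and meta description).

```typescript
// Illustrative sketch, not Teak's actual fetcher. Only matches tags
// written with `property` before `content`, e.g.
// <meta property="og:title" content="...">
function extractOgTag(html: string, property: string): string | undefined {
  const pattern = new RegExp(
    `<meta[^>]+property=["']og:${property}["'][^>]+content=["']([^"']*)["']`,
    "i"
  );
  return pattern.exec(html)?.[1];
}

const html = `<head>
  <meta property="og:title" content="Understanding React Hooks">
  <meta property="og:description" content="A comprehensive guide...">
</head>`;

const title = extractOgTag(html, "title"); // "Understanding React Hooks"
```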

2b. Categorization

Classifies links into 15 categories:
| Category | Description |
| --- | --- |
| book | Books, eBooks |
| movie | Movies, films |
| tv | TV shows, series |
| article | Long-form articles |
| news | News briefs, updates |
| podcast | Podcast episodes |
| music | Music tracks, albums |
| product | Shopping pages |
| recipe | Recipes, cooking guides |
| course | Courses, tutorials |
| research | Research papers |
| event | Events, webinars |
| software | Software, apps, GitHub |
| design_portfolio | Design portfolios |
| other | Miscellaneous |
Categories are defined in packages/convex/shared/linkCategories.ts:1-116.
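For type-safe handling on the client, the category set could be modeled as a union with a runtime guard. This is a sketch; the canonical definitions live in packages/convex/shared/linkCategories.ts.

```typescript
// Sketch of the 15-category set as a const tuple; the source of truth
// is packages/convex/shared/linkCategories.ts.
const LINK_CATEGORIES = [
  "book", "movie", "tv", "article", "news",
  "podcast", "music", "product", "recipe", "course",
  "research", "event", "software", "design_portfolio", "other",
] as const;

type LinkCategory = (typeof LINK_CATEGORIES)[number];

// Runtime guard for values coming from untyped sources (e.g. AI output).
function isLinkCategory(value: string): value is LinkCategory {
  return (LINK_CATEGORIES as readonly string[]).includes(value);
}
```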

Step 3: AI Metadata Generation

Generates tags, summaries, and transcripts based on card type.

AI Tags

Automatically extracts relevant topics and keywords:
```typescript
{
  aiTags: [
    "react",
    "hooks",
    "javascript",
    "frontend",
    "web-development"
  ]
}
```
AI tags are searchable via the search_ai_tags index.
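The example above uses lowercase, hyphenated tags. A hypothetical normalizer producing that shape (not Teak's actual code) might look like:

```typescript
// Hypothetical helper: normalizes free-form tag text into the
// lowercase, hyphenated shape shown in the example above.
function normalizeTag(raw: string): string {
  return raw
    .trim()
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse spaces/punctuation to hyphens
    .replace(/^-+|-+$/g, "");    // strip leading/trailing hyphens
}
```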

AI Summary

Generates concise summaries for:
  • Links: Summarizes page content
  • Videos: Summarizes visual content
  • Documents: Extracts key points
  • Audio: Summarizes spoken content
  • Images: Describes visual content
```typescript
{
  aiSummary: "A comprehensive guide covering useState, useEffect, and custom hooks in React. Includes practical examples and best practices for functional components."
}
```
Summaries are searchable via the search_ai_summary index.

AI Transcript

Generates transcripts for audio and video content:
```typescript
{
  aiTranscript: "Welcome to this tutorial on React Hooks. Today we'll cover the basics of useState and useEffect...",
  type: "audio"
}
```
Transcripts make audio/video content fully searchable via the search_ai_transcript index.

Step 4: Renderables

Generates thumbnails and visual assets.

Thumbnail Generation

Applies to:
  • Videos: Extracts first frame
  • Documents: Renders first page
  • SVG Images: Rasterizes for palette extraction
```typescript
{
  thumbnailId: "k3ghi...",
  thumbnailUrl: "https://..." // Generated by storage.getUrl()
}
```
For SVGs, thumbnails enable color palette extraction which would otherwise fail on vector graphics.

Processing Status

Track AI processing progress:
```typescript
interface ProcessingStatus {
  classify: {
    status: "pending" | "running" | "completed" | "failed";
    startedAt?: number;
    completedAt?: number;
    confidence?: number;
  };
  categorize?: {
    status: "pending" | "running" | "completed" | "failed";
    startedAt?: number;
    completedAt?: number;
  };
  metadata: {
    status: "pending" | "running" | "completed" | "failed";
    startedAt?: number;
    completedAt?: number;
  };
  renderables?: {
    status: "pending" | "running" | "completed" | "failed";
    startedAt?: number;
    completedAt?: number;
  };
}
```

Querying Status

```typescript
const card = useQuery(api.cards.getCard, { cardId });

if (card?.processingStatus?.metadata?.status === "running") {
  // Show loading indicator
} else if (card?.processingStatus?.metadata?.status === "completed") {
  // Display AI tags and summary
}
```
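When a UI needs a single indicator rather than four, the per-step statuses can be collapsed into one. This is a sketch assuming the ProcessingStatus shape documented above, with "failed" taking precedence over "running":

```typescript
// Sketch: derive one display state from the per-step statuses.
// Precedence (failed > running > pending > completed) is a UI choice
// assumed here, not Teak's documented behavior.
type StepStatus = "pending" | "running" | "completed" | "failed";

interface StepState {
  status: StepStatus;
}

function overallStatus(
  steps: Record<string, StepState | undefined>
): StepStatus {
  const statuses = Object.values(steps)
    .filter((s): s is StepState => s !== undefined)
    .map((s) => s.status);
  if (statuses.includes("failed")) return "failed";
  if (statuses.includes("running")) return "running";
  if (statuses.includes("pending")) return "pending";
  return "completed";
}
```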

Retry Logic

Teak uses exponential backoff for resilient processing:
| Step | Max Attempts | Initial Backoff | Base |
| --- | --- | --- | --- |
| Classification | 8 | 400ms | 1.8 |
| Link Metadata | 5 | 5000ms | 2.0 |
| Categorization | 5 | 1200ms | 1.6 |
| AI Metadata | 8 | 400ms | 1.8 |
Retry configuration is in packages/convex/workflows/cardProcessing.ts:19-33.
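The delay before each retry follows the usual exponential pattern. The exact formula (initial × base^(attempt−1)) is an assumption here rather than taken from cardProcessing.ts, but it shows how the table's parameters interact:

```typescript
// Exponential backoff: the delay grows geometrically with each retry.
// The formula is a common convention, assumed for illustration.
function backoffDelayMs(
  attempt: number, // 1-based attempt number
  initialMs: number,
  base: number
): number {
  return Math.round(initialMs * Math.pow(base, attempt - 1));
}

// Classification (400ms initial, base 1.8): 400ms, 720ms, 1296ms, ...
```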

AI Provider Configuration

Teak’s AI features use OpenAI models (configurable via environment variables):

```bash
# .env.local
OPENAI_API_KEY=sk-...
```

AI features require valid API keys; processing fails gracefully when keys are missing.

Performance Considerations

1. Parallel Processing

For most cards, metadata and renderables run in parallel:

```typescript
// From cardProcessing.ts:241-260
const [metadataResult, renderablesResult] = await Promise.all([
  step.runAction(metadata.generate, { cardId }),
  step.runAction(renderables.generate, { cardId })
]);
```
2. Sequential for Videos

Videos need thumbnails before AI analysis:

```typescript
// Generate thumbnail first
const renderablesResult = await step.runAction(
  renderables.generate,
  { cardId }
);

// Then run AI metadata (uses thumbnail)
const metadataResult = await step.runAction(
  metadata.generate,
  { cardId }
);
```
3. Palette Extraction

Images get palette extraction in parallel with other processing:

```typescript
// Runs alongside metadata generation
const palettePromise = step.runAction(
  extractPaletteFromImage,
  { cardId }
);
```

Disabling AI Features

Currently, AI processing is automatic and cannot be disabled per-card. This may change in future versions.
If you prefer not to surface AI output, check processingStatus and render AI fields only when status === "completed".

Best Practices

1. Provide Context in Notes

The notes field helps AI generate better tags and summaries:

```typescript
await createCard({
  content: "Design inspiration",
  url: "https://dribbble.com/shots/...",
  notes: "Minimal UI design for a productivity app, focus on the sidebar navigation"
});
```
2. Use User Tags to Supplement AI

Combine manual tags with AI tags:

```typescript
{
  tags: ["work", "urgent"], // Manual organization
  aiTags: ["react", "typescript", "frontend"] // AI-generated
}
```
3. Monitor Processing Status

Show loading states while AI processes:

```tsx
{card.processingStatus?.metadata?.status === "running" && (
  <Spinner>Generating AI tags...</Spinner>
)}
```

Source Reference

  • Main workflow: packages/convex/workflows/cardProcessing.ts:43-298
  • Link metadata: packages/convex/linkMetadata.ts
  • Link categories: packages/convex/shared/linkCategories.ts
  • Processing status: packages/convex/card/processingStatus.ts
