
Quickstart Guide

Get up and running with Meridian in less than 5 minutes. This guide walks you through signing up, uploading your first dataset, running natural language queries, and generating AI-powered insights.
Prerequisites: You’ll need a Google account for authentication. No credit card required to get started.

Step 1: Sign Up and Access Dashboard

1. Navigate to Meridian

Go to your Meridian instance URL and click Sign in with Google.
// Authentication is handled by Convex Auth
// Supports Google OAuth out of the box
const { signIn } = useAuthActions()
await signIn("google")
Meridian uses Convex Auth for secure authentication. Your data is scoped to your user account automatically.
2. View the Dashboard

After signing in, you’ll land on the dashboard at /dashboard. You’ll see:
  • Analytics cards: Files uploaded, tables processed, total storage used
  • Recent activity: Your upload and processing history
  • Quick actions: Upload file, search tables, view documentation
The dashboard uses real-time subscriptions. If you upload files in another tab or a teammate uploads data, you’ll see updates instantly.

Step 2: Upload Your First Dataset

Meridian supports two upload methods: file upload and URL extraction.

Upload from Local File

1. Click Upload File

From the dashboard, click the Upload File button in the top right or from the sidebar.
2. Drag and Drop

Drag a CSV, XLSX, or XLS file (max 10MB) into the dropzone or click to browse.
// Supported file types from FileUpload.tsx
accept={[MIME_TYPES.csv, MIME_TYPES.xlsx, MIME_TYPES.xls]}
maxSize={10 * 1024 ** 2} // 10MB limit
Files are uploaded to Cloudflare R2 via Convex’s R2 component for cost-effective storage.
3. Automatic Processing

For CSV files, Meridian automatically:
  1. Uploads to R2 storage
  2. Creates a DuckDB table with inferred schema
  3. Shows a success notification with row count
// From FileUpload.tsx - automatic table creation
const tableName = file.name
  .replace(/\.csv$/i, '')
  .replace(/[^a-zA-Z0-9_]/g, '_')
  .toLowerCase()

const result = await createTableFromCSV({
  csvUrl,
  tableName,
})
// Returns: { tableName: "sales_data", rowCount: 15420 }
Table names are sanitized: Sales-Data-2024.csv becomes sales_data_2024 in DuckDB.
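The sanitization rule can be expressed as a small standalone function (a sketch mirroring the replace chain above, not the actual FileUpload.tsx export):

```typescript
// Standalone sketch of the table-name sanitization shown above.
function sanitizeTableName(fileName: string): string {
  return fileName
    .replace(/\.csv$/i, '')         // drop the .csv extension
    .replace(/[^a-zA-Z0-9_]/g, '_') // non-alphanumerics become underscores
    .toLowerCase()
}

console.log(sanitizeTableName('Sales-Data-2024.csv')) // "sales_data_2024"
```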

Example: Upload Sales Data

Upload a CSV file named sales_2024.csv:
date,product,revenue,region
2024-01-15,Widget A,1250.00,North
2024-01-16,Widget B,890.50,South
2024-01-17,Widget A,1450.00,East
After upload, you’ll see:
  • CSV Processed: Created DuckDB table “sales_2024” with 3 rows
  • 📊 Table is now queryable at /table/sales_2024

Step 3: Run Your First Query

Now that you have data loaded, let’s query it.
1. Navigate to Table View

From the dashboard, click View Table on any processed file. This takes you to /table/{table_name}. You’ll see:
  • Data table with your rows and columns
  • Query editor at the bottom of the page
  • AI agent sidebar on the right
2. Try a SQL Query

The query editor starts with SELECT * FROM {table_name}. Modify it:
SELECT product, SUM(revenue) as total_revenue
FROM sales_2024
GROUP BY product
ORDER BY total_revenue DESC
Press Ctrl+Enter (Cmd+Enter on Mac) or click Execute to run.
// From QueryEditor.tsx - keyboard shortcut
if ((e.ctrlKey || e.metaKey) && e.key === 'Enter') {
  e.preventDefault()
  onExecute() // Executes the query
}
Queries are executed server-side using DuckDB’s Node API, so there are no browser memory limits and performance is roughly 10x faster than running DuckDB-WASM in the browser.
3. View Results

Results appear instantly in the data table. The page shows:
  • Row count: Total rows returned by your query
  • Execution time: Logged in milliseconds
  • Paginated results: 50 rows per page by default
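For the three-row sample uploaded in Step 2, the aggregation query above returns two rows. A quick standalone sketch of the same computation (illustrative only, not Meridian code):

```typescript
// Re-compute the Step 3 aggregation in plain TypeScript, using the
// three sample rows from sales_2024.csv.
type Row = { product: string; revenue: number }

const rows: Row[] = [
  { product: 'Widget A', revenue: 1250.0 },
  { product: 'Widget B', revenue: 890.5 },
  { product: 'Widget A', revenue: 1450.0 },
]

// GROUP BY product, SUM(revenue), ORDER BY total_revenue DESC
const sums = new Map<string, number>()
for (const r of rows) sums.set(r.product, (sums.get(r.product) ?? 0) + r.revenue)

const totals = [...sums]
  .map(([product, total_revenue]) => ({ product, total_revenue }))
  .sort((a, b) => b.total_revenue - a.total_revenue)

console.log(totals) // Widget A first (2700), then Widget B (890.5)
```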
// From table.$table.tsx - query execution
const startTime = Date.now()
const result = await queryDuckDB({ data: query })
const executionTime = Date.now() - startTime

// Metadata is logged to Convex
await logQuery({
  query,
  tableName: table,
  success: true,
  resultMetadata: {
    rowCount: parsed.rows?.length,
    columnCount: parsed.columns?.length,
    executionTimeMs: executionTime,
  },
})
All queries are logged with timestamps for full reproducibility.

Step 4: Use Natural Language with AI Agent

Skip SQL and ask questions in plain English.
1. Open Agent Panel

On the table page, the right sidebar shows the AI Agent panel. It has two modes:
  • Query Mode: Generates SQL queries from natural language
  • Analysis Mode: Answers questions using tools (queries, charts, insights)
// From table_agent.ts - two specialized agents
const query_agent = new Agent(components.agent, {
  name: 'Query Agent',
  instructions: `You write DuckDB SQL queries...`,
  tools: { queryDuckDB, getTableSchema, getSampleRows, ... }
})

const analysis_agent = new Agent(components.agent, {
  name: 'Analysis Agent',
  instructions: `You explore and analyze databases...`,
  tools: { queryDuckDB, createChart, generateInsights, ... }
})
2. Ask a Question (Query Mode)

Select Query Mode and type:
Show me the top 5 products by revenue
Click Execute or press Enter. The agent will:
  1. Analyze your table schema and sample rows
  2. Generate a SQL query
  3. Stream its reasoning: “Reading columns… Computing aggregates… Sorting results…”
  4. Return commands you can execute
// Agent response format
{
  "commands": [
    "SELECT product, SUM(revenue) as total_revenue FROM sales_2024 GROUP BY product ORDER BY total_revenue DESC LIMIT 5"
  ],
  "description": "This query aggregates revenue by product and returns the top 5 highest earners"
}
The agent uses Gemini 2.5 Flash for fast, cost-effective generation. Queries stream in real-time so you see progress.
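A minimal sketch of turning such a response into the editor’s command queue (illustrative; Meridian’s actual parsing lives in its frontend code):

```typescript
// Parse the agent's JSON response (shape shown above) into a command queue.
interface AgentResponse {
  commands: string[]
  description: string
}

function toCommandQueue(raw: string): string[] {
  const parsed = JSON.parse(raw) as AgentResponse
  // Each command becomes one entry the user can execute in sequence
  return parsed.commands
}
```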
3. Review and Execute

The generated query appears in the query editor. You can:
  • Edit it before running
  • Click Execute to run it
  • See the agent’s reasoning in the sidebar
If the agent generates multiple commands (e.g., for complex multi-step operations), they’re queued:
// From QueryEditor.tsx - command queue display
{commandQueue.length > 0 && (
  <Badge size="xs">
    {currentCommandIndex + 1} / {commandQueue.length}
  </Badge>
)}
Execute each command in sequence. Results update live.
4. Try Analysis Mode

Switch to Analysis Mode and ask:
What are the trends in this data?
The analysis agent will:
  • Query the table to get data
  • Generate statistical insights
  • Create charts automatically
  • Stream its full reasoning with tool calls
// From table_agent.ts - analysis mode with tools
for await (const st_part of stream.fullStream) {
  if (st_part.type === 'tool-call') {
    // Show tool being called (e.g., "queryDuckDB", "createChart")
  }
  if (st_part.type === 'tool-result') {
    // Show tool result (e.g., "Found 1,523 rows")
  }
  if (st_part.type === 'text-delta') {
    // Stream agent's natural language explanation
  }
}
Analysis mode is perfect for exploratory work. The agent uses tools like queryDuckDB, createChart, generateInsights, and even firecrawlSearch to answer questions.

Step 5: Generate Insights

Meridian automatically analyzes your data for patterns, outliers, and trends.
1. Open Insights Panel

On the table page, click the Insights tab in the right sidebar.
2. Click Generate Insights

Click Generate Insights to trigger AI analysis. Meridian will:
  1. Run statistical queries using DuckDB (nulls, outliers, correlations)
  2. Send findings to Gemini AI for interpretation
  3. Return actionable insights with severity ratings
// From table.$table.tsx - insight generation
const statisticalAnalyses = await analyzeTableWithDuckDB(
  table,
  query,
  dataToAnalyze.columns,
)

const result = await generateInsights({
  tableName: table,
  query: query,
  statisticalAnalyses: statisticalAnalyses,
  rowCount: dataToAnalyze.rows.length,
  columnCount: dataToAnalyze.columns.length,
})
3. Review Insights

Insights appear as cards with:
  • Title: Brief summary (e.g., “High revenue variance detected”)
  • Description: Detailed explanation
  • Type: outlier, trend, aggregation, pattern, anomaly
  • Severity: low, medium, high
// From InsightsPanel.tsx - insight structure
interface Insight {
  title: string
  description: string
  type: 'outlier' | 'trend' | 'aggregation' | 'pattern' | 'anomaly'
  severity: 'low' | 'medium' | 'high'
}
Example insight:
{
  "title": "Revenue spike in Q4",
  "description": "Revenue increased 340% in October-December compared to previous quarters, driven primarily by Widget A sales in the North region",
  "type": "trend",
  "severity": "high"
}
Insights are cached per query. Click Refresh to regenerate with new AI analysis. Cache is scoped to user + table + query hash.
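A plausible sketch of that cache key (the exact derivation is internal to Meridian; this only illustrates the user + table + query-hash scoping):

```typescript
import { createHash } from 'node:crypto'

// Illustrative cache key: insights are scoped to user, table, and query.
function insightCacheKey(userId: string, tableName: string, query: string): string {
  const queryHash = createHash('sha256').update(query).digest('hex')
  return `${userId}:${tableName}:${queryHash}`
}
```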

Step 6: View Live Statistics

Switch to the Statistics tab to see auto-generated statistical findings.
// From table.$table.tsx - automatic statistical analysis
useEffect(() => {
  if (data && data.rows.length > 0) {
    // analyzeTableWithDuckDB is async, so call it from an async IIFE
    // (await is not allowed directly in a useEffect callback)
    void (async () => {
      const statisticalAnalyses = await analyzeTableWithDuckDB(
        table,
        'SELECT * FROM ' + table,
        data.columns,
      )
      setStatisticalFindings(statisticalAnalyses)
    })()
  }
}, [data])
You’ll see:
  • Column summaries: Min, max, mean, median, stddev for numeric columns
  • Null counts: Percentage of missing values per column
  • Data quality: Duplicate detection, outlier identification
  • Distributions: Histograms and frequency counts
Statistical findings update automatically when you execute queries. They’re computed server-side using DuckDB’s analytical functions.
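Illustratively, a per-column summary like the one above can be phrased with DuckDB’s built-in aggregates (MIN, MAX, AVG, MEDIAN, and STDDEV are all available); Meridian’s actual queries live in analyzeTableWithDuckDB:

```typescript
// Sketch: build a DuckDB summary query for one numeric column.
// Assumes table and column names are already sanitized (see Step 2).
function columnStatsQuery(table: string, column: string): string {
  return `
    SELECT
      MIN(${column})    AS min,
      MAX(${column})    AS max,
      AVG(${column})    AS mean,
      MEDIAN(${column}) AS median,
      STDDEV(${column}) AS stddev,
      COUNT(*) - COUNT(${column}) AS null_count
    FROM ${table}
  `
}
```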

Step 7: Create Charts with AI

Use the analysis agent to create interactive, draggable charts.
1. Ask for a Chart

In Analysis Mode, type:
Create a bar chart showing revenue by product
The agent will:
  1. Query the data
  2. Determine appropriate chart type
  3. Configure axes and series
  4. Return a chart config
// From agent_tools.ts - createChart tool
createChart: {
  description: "Create an interactive chart from query results",
  parameters: z.object({
    chartType: z.enum(['bar', 'line', 'area', 'pie', 'donut', 'scatter']),
    title: z.string(),
    query: z.string(),
    xAxis: z.string(),
    yAxis: z.string(),
  }),
  execute: async ({ chartType, title, query, xAxis, yAxis }) => {
    // Execute query and build chart config
  },
}
2. View in Charts Tab

Click the Charts tab. Your chart appears on a draggable canvas. You can:
  • Drag charts by clicking the header
  • Remove charts with the X button
  • See live updates when data changes
// From ChartCanvas.tsx - draggable implementation
const handlePointerDown = (e: React.PointerEvent) => {
  // Capture pointer and canvas rect
  dragOffsetRef.current = {
    x: e.clientX - canvasLeft - position.x,
    y: e.clientY - canvasTop - position.y,
  }
  setIsDragging(true)
}
Charts automatically re-execute their queries when you modify data. If you UPDATE a row, all charts refresh with new data.

Step 8: Collaborate in Real-Time

Meridian’s killer feature: live collaboration.
1. Watch Live Notifications

On the table page, live notifications appear when teammates:
  • Execute queries
  • Generate insights
  • Create charts
  • Ask AI agents questions
// From TableNotifications.tsx - real-time subscription
const notifications = useQuery(api.notifications.getNotifications, {
  tableName,
})

// Broadcasts appear as toasts
notifications.show({
  title: `${userName} executed a query`,
  message: queryPreview,
})
2. See Query History

Click the History tab in the sidebar to see:
  • All queries executed on this table
  • Who ran them and when
  • Success/failure status
  • Execution times
// From schema.ts - queryLog table
queryLog: defineTable({
  query: v.string(),
  executedAt: v.number(),
  userId: v.string(),
  tableName: v.string(),
  success: v.boolean(),
  resultMetadata: v.optional(v.object({
    rowCount: v.optional(v.number()),
    executionTimeMs: v.optional(v.number()),
  })),
})
Click any query to see its full details and results.
3. Rollback to Previous State

From the history panel, click Rollback on any query to restore the table to that point in time.
Rollback is experimental. It replays all queries up to the selected point. For large tables, this can take time.
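Conceptually, the replay looks like this (a hedged sketch; `LoggedQuery` and `runQuery` are illustrative names, not Meridian’s API):

```typescript
// Restore table state by re-running every successful logged query up to
// the chosen point in time, oldest first.
type LoggedQuery = { query: string; executedAt: number; success: boolean }

async function rollbackTo(
  log: LoggedQuery[],
  pointInTime: number,
  runQuery: (sql: string) => Promise<void>,
): Promise<number> {
  const toReplay = log
    .filter((q) => q.success && q.executedAt <= pointInTime)
    .sort((a, b) => a.executedAt - b.executedAt)
  for (const q of toReplay) await runQuery(q.query)
  return toReplay.length // number of queries replayed
}
```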

What’s Next?

You’ve completed the quickstart! You now know how to:
✅ Upload data (CSV or URL extraction)
✅ Run SQL queries with the editor
✅ Use natural language with AI agents
✅ Generate automatic insights
✅ View statistical findings
✅ Create interactive charts
✅ Collaborate in real-time

Explore Key Features

Deep dive into all 10 core features with advanced examples

API Reference

Learn about the query API, agent tools, and chart configurations

Troubleshooting

Meridian has a 10MB limit per file. For larger datasets:
  1. Split the CSV into chunks
  2. Upload to R2 externally and provide a URL
  3. Contact support for enterprise limits
Server-side queries have a timeout. For very large tables:
  1. Add WHERE clauses to filter data
  2. Use LIMIT to restrict result size
  3. Create aggregated views for common queries
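For example, a pre-aggregated view for a query you run often (illustrative; DuckDB supports `CREATE VIEW`, but whether views persist depends on your Meridian deployment):

```sql
-- Hypothetical pre-aggregated view for a common revenue query
CREATE VIEW revenue_by_product AS
SELECT product, SUM(revenue) AS total_revenue
FROM sales_2024
GROUP BY product;
```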
The AI agent occasionally generates syntax errors. To fix:
  1. Edit the query in the query editor
  2. Check table schema for correct column names
  3. Provide more context in your prompt (“use the revenue column, not price”)
If insights fail:
  1. Ensure your query returns data (at least 1 row)
  2. Check that Gemini API key is configured
  3. Try clicking Refresh to bypass cache
For more help, use Ctrl+K (Cmd+K) to open the command palette and search tables, files, or quick actions.
