After building a durable AI agent, you already get UI message chunks for displaying tool invocations and return values. However, for long-running steps, you may want to show progress updates or stream step output to the user while it’s being generated. Workflow DevKit enables this by letting step functions write custom chunks to the same stream the agent uses. These chunks appear as data parts in your messages, which you can render however you like.

Overview

Data parts allow tools to stream arbitrary data to the client during execution:
  • Progress updates: Show percentage complete, current step, estimated time remaining
  • Intermediate results: Display results as they’re found (e.g., search results)
  • Status changes: Notify users when entering different phases of execution
  • Rich media: Stream images, files, or other content incrementally

Basic Example

Let’s extend a flight search tool to emit progress updates:
1. Define Your Data Part Type

First, define a TypeScript type for your custom data part:
types/chat.ts
export interface FoundFlightDataPart {
  type: "data-found-flight"; // Must start with "data-"
  id: string;
  data: {
    flightNumber: string;
    airline: string;
    from: string;
    to: string;
    price: number;
  };
}
The type field must be a string starting with data- followed by your custom identifier. The id field should incorporate the toolCallId so the client can associate the data with the correct tool invocation; writing a part with an id you have already used updates that part in place rather than appending a new one.
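Because the type field is a literal string, you can pair the interface with a type guard so client code narrows parts without casting. A minimal sketch (the guard name is illustrative):

```typescript
interface FoundFlightDataPart {
  type: "data-found-flight";
  id: string;
  data: {
    flightNumber: string;
    airline: string;
    from: string;
    to: string;
    price: number;
  };
}

// Narrow any part with a string `type` to the custom data part above.
function isFoundFlightPart(part: { type: string }): part is FoundFlightDataPart {
  return part.type === "data-found-flight";
}
```

On the client, checking `isFoundFlightPart(part)` lets TypeScript infer the shape of `part.data` instead of requiring an `as` cast.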
2. Emit Updates from Your Tool

Use getWritable() inside a step function to write to the stream:
workflows/chat/steps/tools.ts
import { getWritable } from "workflow";
import type { UIMessageChunk } from "ai";

export async function searchFlights(
  { from, to, date }: { from: string; to: string; date: string },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  try {
    // Generate flights (simulated)
    const flights = await generateFlights(from, to, date);

    // Emit each flight as it's found
    for (const flight of flights) {
      await writer.write({
        type: "data-found-flight",
        id: `${toolCallId}-${flight.flightNumber}`,
        data: flight,
      });

      // Simulate processing time
      await new Promise((resolve) => setTimeout(resolve, 1000));
    }

    return {
      message: `Found ${flights.length} flights from ${from} to ${to}`,
      count: flights.length,
    };
  } finally {
    writer.releaseLock();
  }
}
Key points:
  • Call getWritable<UIMessageChunk>() to get the stream
  • Use getWriter() to acquire a writer
  • Write objects with type, id, and data fields
  • Always call releaseLock() when done (use try/finally)
3. Handle Data Parts in the Client

Update your chat component to detect and render the custom data parts:
app/page.tsx
"use client";

import { useChat } from "@ai-sdk/react";

export default function ChatPage() {
  const { messages } = useChat();

  return (
    <div>
      {messages.map((message) => (
        <div key={message.id}>
          {message.parts.map((part, partIndex) => {
            // Render text parts
            if (part.type === "text") {
              return <div key={partIndex}>{part.text}</div>;
            }

            // Render streaming flight data parts
            if (part.type === "data-found-flight") {
              const flight = part.data as {
                flightNumber: string;
                airline: string;
                from: string;
                to: string;
                price: number;
              };

              return (
                <div
                  key={part.id}
                  className="border rounded-lg p-3 bg-muted"
                >
                  <div className="font-medium">
                    {flight.airline} - {flight.flightNumber}
                  </div>
                  <div className="text-sm text-muted-foreground">
                    {flight.from} → {flight.to}
                  </div>
                  <div className="text-sm font-medium">
                    ${flight.price}
                  </div>
                </div>
              );
            }

            return null;
          })}
        </div>
      ))}
    </div>
  );
}
Now when the agent searches for flights, you’ll see each result appear one by one in real time.

Advanced Patterns

Progress Indicators

Show percentage-based progress for long-running operations:
workflows/chat/steps/tools.ts
import { getWritable } from "workflow";
import type { UIMessageChunk } from "ai";

export async function processLargeDataset(
  { datasetId }: { datasetId: string },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  try {
    const totalItems = await getDatasetSize(datasetId);
    let processed = 0;

    // Process items in batches
    for await (const batch of fetchBatches(datasetId)) {
      await processBatch(batch);
      processed += batch.length;

      // Emit progress update
      await writer.write({
        type: "data-progress",
        id: toolCallId,
        data: {
          current: processed,
          total: totalItems,
          percentage: Math.round((processed / totalItems) * 100),
        },
      });
    }

    return { processed, total: totalItems };
  } finally {
    writer.releaseLock();
  }
}
Render the progress bar:
app/page.tsx
if (part.type === "data-progress") {
  const { current, total, percentage } = part.data as {
    current: number;
    total: number;
    percentage: number;
  };

  return (
    <div key={part.id} className="space-y-2">
      <div className="flex justify-between text-sm">
        <span>Processing...</span>
        <span>
          {current} / {total}
        </span>
      </div>
      <div className="w-full bg-gray-200 rounded-full h-2">
        <div
          className="bg-blue-600 h-2 rounded-full transition-all"
          style={{ width: `${percentage}%` }}
        />
      </div>
    </div>
  );
}

Multi-Stage Progress

Track progress through multiple stages:
workflows/chat/steps/tools.ts
export async function buildProject(
  { projectId }: { projectId: string },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  const stages = [
    { name: "Installing dependencies", fn: () => installDependencies(projectId) },
    { name: "Running tests", fn: () => runTests(projectId) },
    { name: "Building application", fn: () => buildApp(projectId) },
    { name: "Deploying", fn: () => deploy(projectId) },
  ];

  try {
    for (let i = 0; i < stages.length; i++) {
      const stage = stages[i];

      // Emit stage start
      await writer.write({
        type: "data-build-stage",
        id: `${toolCallId}-stage-${i}`,
        data: {
          stage: i + 1,
          total: stages.length,
          name: stage.name,
          status: "running",
        },
      });

      // Execute stage
      await stage.fn();

      // Emit stage complete
      await writer.write({
        type: "data-build-stage",
        id: `${toolCallId}-stage-${i}`,
        data: {
          stage: i + 1,
          total: stages.length,
          name: stage.name,
          status: "complete",
        },
      });
    }

    return { success: true, stages: stages.length };
  } finally {
    writer.releaseLock();
  }
}
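Note that both writes for a stage reuse the same id, so the "complete" chunk updates the "running" chunk rather than appending a duplicate. If your client library does not already reconcile data parts by id, a small reducer can do it; a sketch (names are illustrative):

```typescript
interface BuildStageChunk {
  type: "data-build-stage";
  id: string;
  data: {
    stage: number;
    total: number;
    name: string;
    status: "running" | "complete";
  };
}

// Keep only the most recent chunk for each id, preserving first-seen order.
function latestById(chunks: BuildStageChunk[]): BuildStageChunk[] {
  const byId = new Map<string, BuildStageChunk>();
  for (const chunk of chunks) byId.set(chunk.id, chunk);
  return [...byId.values()];
}
```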

Streaming Rich Content

Stream images or files as they’re generated:
workflows/chat/steps/tools.ts
export async function generateImages(
  { prompt, count }: { prompt: string; count: number },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  try {
    const images = [];

    for (let i = 0; i < count; i++) {
      // Generate image
      const imageUrl = await generateImage(prompt);
      images.push(imageUrl);

      // Stream image immediately
      await writer.write({
        type: "data-generated-image",
        id: `${toolCallId}-image-${i}`,
        data: {
          url: imageUrl,
          index: i + 1,
          total: count,
        },
      });
    }

    return { images, count };
  } finally {
    writer.releaseLock();
  }
}

Error States

Stream error information without throwing:
workflows/chat/steps/tools.ts
export async function validateData(
  { data }: { data: any[] },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();

  try {
    const errors: any[] = [];

    for (let i = 0; i < data.length; i++) {
      const item = data[i];
      const validation = validateItem(item);

      if (!validation.valid) {
        errors.push({ index: i, errors: validation.errors });

        // Stream error immediately
        await writer.write({
          type: "data-validation-error",
          id: `${toolCallId}-error-${i}`,
          data: {
            index: i,
            errors: validation.errors,
          },
        });
      }
    }

    return {
      valid: errors.length === 0,
      errorCount: errors.length,
      totalItems: data.length,
    };
  } finally {
    writer.releaseLock();
  }
}

Reusable Stream Helpers

Create reusable helpers for common streaming patterns:
workflows/chat/utils/stream-helpers.ts
import { getWritable } from "workflow";
import type { UIMessageChunk } from "ai";

export async function writeProgress(
  toolCallId: string,
  current: number,
  total: number
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();
  try {
    await writer.write({
      type: "data-progress",
      id: toolCallId,
      data: {
        current,
        total,
        percentage: Math.round((current / total) * 100),
      },
    });
  } finally {
    writer.releaseLock();
  }
}

export async function writeStatus(
  toolCallId: string,
  status: string,
  details?: any
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();
  try {
    await writer.write({
      type: "data-status",
      id: toolCallId,
      data: { status, details, timestamp: Date.now() },
    });
  } finally {
    writer.releaseLock();
  }
}

export async function writeResult<T>(
  toolCallId: string,
  result: T,
  index?: number
) {
  "use step";

  const writable = getWritable<UIMessageChunk>();
  const writer = writable.getWriter();
  try {
    await writer.write({
      type: "data-result",
      id: index !== undefined ? `${toolCallId}-${index}` : toolCallId,
      data: result,
    });
  } finally {
    writer.releaseLock();
  }
}
Use the helpers in your tools:
workflows/chat/steps/tools.ts
import { writeProgress, writeResult } from "../utils/stream-helpers";

export async function searchFlights(
  { from, to }: { from: string; to: string },
  { toolCallId }: { toolCallId: string }
) {
  "use step";

  const flights = await fetchFlights(from, to);

  for (let i = 0; i < flights.length; i++) {
    await writeProgress(toolCallId, i + 1, flights.length);
    await writeResult(toolCallId, flights[i], i);
  }

  return { count: flights.length };
}

Best Practices

Always Release Locks

Use try/finally to ensure locks are released:
const writer = writable.getWriter();
try {
  await writer.write(data);
} finally {
  writer.releaseLock(); // Always called, even on error
}
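If you repeat this pattern often, you can factor it into a small wrapper so callers never touch the lock directly. A sketch (the `withWriter` name is illustrative); it accepts any `WritableStream`, including the one returned by `getWritable()`:

```typescript
// Acquire a writer, run the callback, and always release the lock.
async function withWriter<T>(
  writable: WritableStream<T>,
  fn: (writer: WritableStreamDefaultWriter<T>) => Promise<void>
): Promise<void> {
  const writer = writable.getWriter();
  try {
    await fn(writer);
  } finally {
    writer.releaseLock();
  }
}
```

Inside a step this becomes a one-liner, e.g. `await withWriter(getWritable<UIMessageChunk>(), (w) => w.write(chunk))`.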

Batch Updates

Avoid streaming too frequently; batch updates for better performance:
// ❌ Too frequent
for (let i = 0; i < 10000; i++) {
  await writeProgress(toolCallId, i, 10000);
}

// ✅ Batched
for (let i = 0; i < 10000; i++) {
  if (i % 100 === 0) {
    await writeProgress(toolCallId, i, 10000);
  }
}
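Another option is to emit only when the value the user actually sees changes, e.g. when the integer percentage ticks over. This caps writes at roughly 100 regardless of item count. A sketch (the helper name is illustrative):

```typescript
// Returns a gate that is true only when the rounded percentage changes.
function makePercentGate(total: number) {
  let lastPercent = -1;
  return (current: number): boolean => {
    const percent = Math.round((current / total) * 100);
    if (percent === lastPercent) return false;
    lastPercent = percent;
    return true;
  };
}
```

Usage: `const shouldEmit = makePercentGate(items.length);` then `if (shouldEmit(i)) await writeProgress(toolCallId, i, items.length);` inside the loop.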

Type Safety

Define types for your data parts:
interface ProgressDataPart {
  type: "data-progress";
  id: string;
  data: {
    current: number;
    total: number;
    percentage: number;
  };
}

// Use in both server and client
const part: ProgressDataPart = {
  type: "data-progress",
  id: toolCallId,
  data: { current: 50, total: 100, percentage: 50 },
};
