Meros parses multipart HTTP responses, making it a natural transport layer for GraphQL’s @defer and @stream directives, which enable incremental data delivery for improved perceived performance.

Why use Meros with GraphQL?

GraphQL’s @defer and @stream directives allow servers to send query results incrementally using multipart responses. This reduces time-to-first-byte and improves user experience by showing critical data immediately while loading less important data progressively.
Meros handles the multipart transport protocol so you can focus on your GraphQL implementation.
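On the wire, an incremental response arrives as a chunked multipart/mixed body. A hedged sketch of what that raw stream might look like (the boundary "-" and all field values here are illustrative; the exact shape depends on your server):

```
Content-Type: multipart/mixed; boundary="-"

---
Content-Type: application/json; charset=utf-8

{"data":{"user":{"id":"123","name":"Ada"}},"hasNext":true}
---
Content-Type: application/json; charset=utf-8

{"data":{"posts":[{"title":"Hello"}]},"path":["user"],"hasNext":false}
-----
```

Meros reads this stream for you and yields each JSON part as the server flushes it.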

Understanding @defer and @stream

The @defer directive tells the server to send parts of a query later, prioritizing critical data:
query {
  user {
    id
    name
    ... on User @defer {
      posts {
        title
        content
      }
    }
  }
}
The server immediately returns id and name, then sends the posts data as a separate part when ready.
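Concretely, the client sees two payloads. A sketch of their shapes, assuming the data/path/hasNext format that graphql-js's early incremental delivery produces (all field values are illustrative):

```javascript
// First payload: the non-deferred selection, with a flag that more is coming.
const initialPart = {
  data: { user: { id: '123', name: 'Ada Lovelace' } },
  hasNext: true,
};

// Second payload: the deferred fragment's data, plus the path at which
// to merge it into the original result.
const deferredPart = {
  data: { posts: [{ title: 'Hello', content: 'First post' }] },
  path: ['user'],
  hasNext: false, // stream is complete
};
```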

Consuming deferred/streamed responses

1. Execute your GraphQL query

Send a query with @defer or @stream directives to your GraphQL endpoint:
const query = `
  query {
    user(id: "123") {
      id
      name
      ... @defer {
        posts {
          title
          content
        }
      }
    }
  }
`;

const response = await fetch('/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query }),
});

2. Process parts with Meros

Use Meros to consume the multipart response:
import { meros } from 'meros';

const parts = await meros(response);

for await (const part of parts) {
  if (part.json) {
    // Each part contains a GraphQL response
    const { data, path, errors } = part.body;
    
    // Update your store/cache with the new data
    updateCache(path, data);
  }
}
Each part of a GraphQL multipart response is itself a GraphQL response object: data, an optional path locating the payload within the original result, optional errors, and (in graphql-js’s incremental delivery format) a hasNext flag signalling whether more parts follow.
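The updateCache call above is left abstract. A minimal sketch of what it might do, assuming a plain-object cache and the path/data shape shown earlier (the cache and helper are illustrative, not part of Meros or any GraphQL client):

```javascript
// Illustrative in-memory cache that merges each part's data at its path.
const cache = {};

function updateCache(path, data) {
  if (!path || path.length === 0) {
    // Initial part: merge at the root of the result
    Object.assign(cache, data);
    return;
  }
  // Walk to the object the path points at, creating nodes as needed
  let node = cache;
  for (const key of path) {
    node[key] ??= typeof key === 'number' ? [] : {};
    node = node[key];
  }
  Object.assign(node, data);
}

// Example: an initial part followed by a deferred part
updateCache([], { user: { id: '123', name: 'Ada' } });
updateCache(['user'], { posts: [{ title: 'Hello' }] });

console.log(cache.user.name, cache.user.posts.length); // → Ada 1
```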

3. Optimize with multiple mode

Use multiple: true to process all available parts in a chunk synchronously:
const chunks = await meros(response, { multiple: true });

for await (const parts of chunks) {
  // Process multiple parts at once
  for (const part of parts) {
    const { data, path } = part.body;
    updateCache(path, data);
  }
  
  // Commit all updates to your store in one batch
  commitUpdates();
}
This prevents multiple re-renders by batching updates instead of processing parts one at a time.
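The commitUpdates call is app-specific. One way to sketch the batching idea, using hypothetical names (pending, applyUpdate) that stand in for your store's real API:

```javascript
// Illustrative batching: queue updates per chunk, flush them in one commit.
const pending = [];
let renders = 0;

function updateCache(path, data) {
  pending.push({ path, data }); // cheap: just queue the update
}

function applyUpdate(path, data) {
  /* merge data into the store at path */
}

function commitUpdates() {
  for (const { path, data } of pending) {
    applyUpdate(path, data);
  }
  pending.length = 0;
  renders += 1; // one re-render per chunk, not one per part
}

// Simulate a chunk that carried three parts:
updateCache([], {});
updateCache(['user'], {});
updateCache(['user', 'posts'], {});
commitUpdates();

console.log(renders); // → 1
```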

Integration with GraphQL clients

Meros works with popular GraphQL clients. With Relay, for example, wrap the multipart stream in a Relay Observable so each part is pushed into the store as it arrives:
import { meros } from 'meros';
import { Environment, Network, Observable, RecordSource, Store } from 'relay-runtime';

const fetchQuery = (operation, variables) =>
  // Relay cannot consume meros's async iterator directly, so adapt it
  // to a Relay Observable and emit one GraphQL response per part.
  Observable.create((sink) => {
    (async () => {
      const response = await fetch('/graphql', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ query: operation.text, variables }),
      });

      const parts = await meros(response);

      if (Symbol.asyncIterator in parts) {
        // Multipart (@defer/@stream) response: push each part as it arrives
        for await (const part of parts) {
          if (part.json) sink.next(part.body);
        }
      } else {
        // Non-multipart response: meros hands back the plain Response
        sink.next(await parts.json());
      }
      sink.complete();
    })().catch((error) => sink.error(error));
  });

const environment = new Environment({
  network: Network.create(fetchQuery),
  store: new Store(new RecordSource()),
});

Performance optimization

When working with GraphQL and Meros, consider these optimizations:

Use multiple mode for batching

The multiple: true option allows you to commit multiple parts to your store synchronously, reducing the number of re-renders:
const chunks = await meros(response, { multiple: true });

for await (const parts of chunks) {
  // Aggregate all parts before committing
  const updates = parts.map(part => ({ path: part.body.path, data: part.body.data }));
  
  // Single commit instead of multiple
  store.commit(updates);
}

Prioritize critical data

Use @defer strategically to show important content first:
query {
  # Critical: shown immediately
  user {
    id
    name
    avatar
  }
  
  # Deferred: shown when ready
  ... @defer {
    recommendations {
      id
      title
    }
  }
}
Meros is highly performant, processing over 1.2 million operations per second in Node and 800k ops/sec in browsers.

Next steps

- Basic usage: learn the fundamentals of Meros
- RxJS integration: use Meros with reactive streams
