
Overview

Many Roblox APIs have limits on array parameters (e.g., max 100 universe IDs per request). RoZod’s fetchApiSplit automatically splits large requests into smaller batches.

Basic usage

Use fetchApiSplit to process arrays that exceed API limits:
import { fetchApiSplit } from 'rozod';
import { getGamesIcons } from 'rozod/lib/endpoints/gamesv1';

// Will automatically split into batches of 100 universeIds per request
const data = await fetchApiSplit(
  getGamesIcons,
  { universeIds: [1, 2, 3, 4, 5, /* ...hundreds more IDs */] },
  { universeIds: 100 } // Max 100 IDs per request
);

console.log(data);
RoZod sends multiple requests in parallel and combines the results into a single array.

How it works

1. Define your large input

Start with an array parameter that might exceed API limits:
const universeIds = Array.from({ length: 500 }, (_, i) => i + 1);
2. Specify the batch size

Tell RoZod the maximum items per request:
const batchSize = { universeIds: 100 };
3. RoZod splits and fetches

RoZod automatically:
  • Splits your 500 IDs into 5 batches of 100
  • Sends 5 parallel requests
  • Combines the results
const results = await fetchApiSplit(
  getGamesIcons,
  { universeIds },
  batchSize
);
4. Process the combined results

Results are returned as an array of responses:
console.log(results); // Array of 5 response objects

Complete example

Here’s a real-world example fetching icons for many games:
import { fetchApiSplit, isAnyErrorResponse } from 'rozod';
import { getGamesIcons } from 'rozod/lib/endpoints/gamesv1';

async function getIconsForManyGames(universeIds: number[]) {
  const results = await fetchApiSplit(
    getGamesIcons,
    { universeIds },
    { universeIds: 100 } // Roblox limit is 100 per request
  );

  if (isAnyErrorResponse(results)) {
    console.error('Failed to fetch icons:', results.message);
    return [];
  }

  // Flatten the array of responses into a single array
  return results.flatMap(response => response.data);
}

const universeIds = Array.from({ length: 500 }, (_, i) => i + 1);
const icons = await getIconsForManyGames(universeIds);

console.log(`Fetched ${icons.length} game icons`);

Transforming results

You can transform each batch’s response before combining:
import { fetchApiSplit } from 'rozod';
import { getGamesIcons } from 'rozod/lib/endpoints/gamesv1';

const data = await fetchApiSplit(
  getGamesIcons,
  { universeIds: [1, 2, 3, /* ...many more */] },
  { universeIds: 100 },
  // Transform function extracts just the data array
  (response) => response.data
);

// data is now an array of data arrays, not full responses
console.log(data);
The transform function is useful for extracting nested data or filtering results.

Advanced examples

The same pattern works for any endpoint that takes an array parameter:
import { fetchApiSplit } from 'rozod';
import { getUsersUserdetails } from 'rozod/lib/endpoints/usersv1';

const results = await fetchApiSplit(
  getUsersUserdetails,
  { 
    userIds: Array.from({ length: 200 }, (_, i) => i + 1)
  },
  { userIds: 50 } // Max 50 users per request
);

console.log(results);

Error handling

If any batch fails, the entire operation returns an error:
import { fetchApiSplit, isAnyErrorResponse } from 'rozod';
import { getGamesIcons } from 'rozod/lib/endpoints/gamesv1';

const results = await fetchApiSplit(
  getGamesIcons,
  { universeIds: [1, 2, 3, /* ...many more */] },
  { universeIds: 100 }
);

if (isAnyErrorResponse(results)) {
  console.error('Batch processing failed:', results.message);
  // All batches are aborted on first error
} else {
  console.log('All batches completed successfully');
  console.log(`Received ${results.length} batch responses`);
}
fetchApiSplit stops on the first error. If you need to process all batches even when some fail, implement your own batching logic.

Performance considerations

Parallel requests

fetchApiSplit sends all batches in parallel using Promise.all:
// This sends 10 requests simultaneously
const results = await fetchApiSplit(
  getGamesIcons,
  { universeIds: Array.from({ length: 1000 }, (_, i) => i) },
  { universeIds: 100 }
);
Parallel requests are fast but may hit rate limits. Consider using sequential processing for large batches.

Rate limiting

For very large datasets, implement your own batching with delays:
import { fetchApi, isAnyErrorResponse } from 'rozod';
import { getGamesIcons } from 'rozod/lib/endpoints/gamesv1';

async function fetchWithDelay(universeIds: number[], batchSize: number) {
  const results = [];
  
  for (let i = 0; i < universeIds.length; i += batchSize) {
    const batch = universeIds.slice(i, i + batchSize);
    const response = await fetchApi(getGamesIcons, { universeIds: batch });
    
    if (!isAnyErrorResponse(response)) {
      results.push(response);
    }
    
    // Wait 100ms between batches to avoid rate limiting
    if (i + batchSize < universeIds.length) {
      await new Promise(resolve => setTimeout(resolve, 100));
    }
  }
  
  return results;
}

When to use batch processing

Use fetchApiSplit when

  • You have large input arrays
  • The API has parameter limits
  • All requests can run in parallel
  • You want automatic error handling

Use manual batching when

  • You need sequential processing
  • You want custom rate limiting
  • You need to handle partial failures
  • You need progress tracking

Batch processing vs pagination

Don’t confuse batch processing with pagination:
  • Batch processing: Splits your input into smaller chunks
  • Pagination: Fetches multiple pages of output from the API
// Batch processing: many inputs → many requests
await fetchApiSplit(
  getGamesIcons,
  { universeIds: [1, 2, 3, /* ...up to 1000 */] }, // Large input
  { universeIds: 100 }
);

// Pagination: one input → many pages of output
await fetchApiPages(
  getGroupsGroupidWallPosts,
  { groupId: 11479637 } // Single input, many result pages
);
See the pagination guide for handling multi-page API responses.

Next steps

Pagination

Learn about handling paginated responses

Custom endpoints

Define your own batch-enabled endpoints
