
GET /v1/batches

Returns a paginated list of all batches.

Authentication

Requires provider authentication headers:
x-portkey-provider: openai
Authorization: Bearer YOUR_OPENAI_API_KEY

Request

Headers

x-portkey-provider
string
required
The provider to route the request to (e.g., openai)
Authorization
string
required
Bearer token for the provider API

Query Parameters

limit
integer
Number of batches to return (default: 20, max: 100)
after
string
Cursor for pagination - returns batches after this batch ID

Response

data
array
Array of batch objects
object
string
The object type, always "list"
has_more
boolean
Whether there are more results available
first_id
string
ID of the first batch in the list
last_id
string
ID of the last batch in the list

Request Example

curl "http://localhost:8787/v1/batches?limit=10" \
  -H "x-portkey-provider: openai" \
  -H "Authorization: Bearer $OPENAI_API_KEY"

Response Example

{
  "data": [
    {
      "id": "batch_abc123",
      "object": "batch",
      "endpoint": "/v1/chat/completions",
      "errors": null,
      "input_file_id": "file-abc123",
      "completion_window": "24h",
      "status": "completed",
      "output_file_id": "file-xyz789",
      "error_file_id": null,
      "created_at": 1713894800,
      "in_progress_at": 1713894900,
      "completed_at": 1713898500,
      "metadata": {
        "description": "Daily batch job"
      }
    },
    {
      "id": "batch_def456",
      "object": "batch",
      "endpoint": "/v1/embeddings",
      "status": "in_progress",
      "input_file_id": "file-def456",
      "created_at": 1713895000,
      "metadata": {}
    }
  ],
  "object": "list",
  "has_more": false,
  "first_id": "batch_abc123",
  "last_id": "batch_def456"
}
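The `has_more`, `last_id`, and `after` fields together support cursor pagination: request the next page by passing the previous page's `last_id` as `after`, and stop once `has_more` is false. A minimal sketch (the `iter_batches` helper is hypothetical, not part of any SDK; it works with any callable that mimics the list endpoint's signature):

```python
def iter_batches(list_page, limit=100):
    """Yield every batch by following the `after` cursor until has_more is False.

    `list_page` is any callable mimicking the list endpoint:
    list_page(limit=..., after=...) -> object with .data, .has_more, .last_id
    """
    after = None
    while True:
        page = list_page(limit=limit, after=after)
        yield from page.data
        if not page.has_more:
            break
        # Resume from the last batch of this page on the next request
        after = page.last_id
```

With the Python SDK this could be driven as `iter_batches(client.batches.list)`, assuming the SDK forwards `limit` and `after` as query parameters.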

Filtering and Monitoring

Monitor Active Batches

from portkey_ai import Portkey

client = Portkey(
    provider="openai",
    Authorization="sk-***"
)

# List up to 100 batches (one page) and keep the active ones
batches = client.batches.list(limit=100)
active = [b for b in batches.data if b.status in ["validating", "in_progress", "finalizing"]]

print(f"Active batches: {len(active)}")
for batch in active:
    print(f"  {batch.id}: {batch.status}")

Find Recent Failures

# Find failed batches created in the last 24 hours (first page only)
import time
day_ago = int(time.time()) - 86400

batches = client.batches.list(limit=100)
failed = [
    b for b in batches.data 
    if b.status == "failed" and b.created_at > day_ago
]

print(f"Failed batches in last 24h: {len(failed)}")

Best Practices

The API doesn’t support server-side filtering by status or date. Retrieve batches and filter them in your application code.
Regularly poll the list endpoint to monitor batch progress and detect failures early.
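Since filtering happens client-side, a small helper that buckets batches by lifecycle stage keeps polling code readable. A sketch, assuming the full status set follows the OpenAI batch lifecycle (the in-flight statuses validating, in_progress, and finalizing appear above; the remaining values are assumptions):

```python
# In-flight vs. terminal statuses; assumes the OpenAI batch lifecycle.
ACTIVE = {"validating", "in_progress", "finalizing", "cancelling"}
TERMINAL = {"completed", "failed", "expired", "cancelled"}

def summarize(batches):
    """Bucket batch objects by lifecycle stage for a monitoring report."""
    summary = {"active": [], "terminal": [], "unknown": []}
    for b in batches:
        if b.status in ACTIVE:
            summary["active"].append(b.id)
        elif b.status in TERMINAL:
            summary["terminal"].append(b.id)
        else:
            summary["unknown"].append(b.id)
    return summary
```

A poll might then call `summarize(client.batches.list(limit=100).data)` on an interval and alert whenever the `terminal` bucket gains a failed batch.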

Create Batch

Create a new batch

Retrieve Batch

Check specific batch status

Cancel Batch

Cancel a running batch
