POST /v1/batches
from openai import OpenAI
client = OpenAI()

batch = client.batches.create(
    input_file_id="file-abc123",
    endpoint="/v1/chat/completions",
    completion_window="24h",
    metadata={"description": "nightly eval job"}
)
{
  "id": "batch_abc123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "input_file_id": "file-abc123",
  "completion_window": "24h",
  "status": "validating",
  "created_at": 1711073447,
  "request_counts": {
    "total": 0,
    "completed": 0,
    "failed": 0
  },
  "metadata": {
    "description": "nightly eval job"
  }
}
Creates and executes a batch from an uploaded file of requests.

Request Body

completion_window
string
required
The time frame within which the batch should be processed. Currently only 24h is supported.
endpoint
string
required
The endpoint to be used for all requests in the batch. Currently supported:
  • /v1/responses
  • /v1/chat/completions
  • /v1/embeddings
  • /v1/completions
  • /v1/moderations
  • /v1/images/generations
  • /v1/images/edits
Note: /v1/embeddings batches are restricted to a maximum of 50,000 embedding inputs.
input_file_id
string
required
The ID of an uploaded file that contains requests for the new batch. Your input file must be:
  • Formatted as a JSONL file
  • Uploaded with the purpose batch
  • Up to 50,000 requests
  • Up to 200 MB in size
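For illustration, a minimal sketch of building such a JSONL input file. Each line is one request with `custom_id`, `method`, `url`, and `body` fields, per the Batch API's documented line format; the model name and prompts here are placeholders:

```python
import json

# Each line of the input file is one request. The url must match the
# batch's `endpoint`; custom_id is echoed back in the output so results
# can be matched to inputs. Model and prompts are placeholders.
requests = [
    {
        "custom_id": f"request-{i}",
        "method": "POST",
        "url": "/v1/chat/completions",
        "body": {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": q}],
        },
    }
    for i, q in enumerate(["What is 2+2?", "Name a prime number."])
]

with open("batch_input.jsonl", "w") as f:
    for req in requests:
        f.write(json.dumps(req) + "\n")
```

The resulting file is then uploaded with `client.files.create(file=..., purpose="batch")`, and the returned file ID is passed as `input_file_id`.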
metadata
object
Set of 16 key-value pairs that can be attached to an object. This can be useful for storing additional information about the object in a structured format. Keys are strings with a maximum length of 64 characters. Values are strings with a maximum length of 512 characters.
output_expires_after
object
The expiration policy for the output and/or error files that are generated for a batch.

Response

Returns a Batch object.
id
string
The batch ID.
object
string
The object type, always batch.
endpoint
string
The OpenAI API endpoint used by the batch.
input_file_id
string
The ID of the input file for the batch.
completion_window
string
The time frame within which the batch should be processed.
status
string
The current status of the batch. One of:
  • validating
  • failed
  • in_progress
  • finalizing
  • completed
  • expired
  • cancelling
  • cancelled
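Because batches complete asynchronously within the completion window, clients typically poll until the batch reaches a terminal status. A minimal polling sketch, assuming the terminal statuses are exactly those in the list above that cannot transition further (`client.batches.retrieve` is the SDK call for fetching a batch by ID):

```python
import time

# Statuses after which the batch will not change again (assumption,
# based on the status list above).
TERMINAL_STATUSES = {"completed", "failed", "expired", "cancelled"}

def is_terminal(status: str) -> bool:
    return status in TERMINAL_STATUSES

def wait_for_batch(batch_id: str, poll_seconds: float = 30.0):
    """Poll the batch until it reaches a terminal status, then return it."""
    from openai import OpenAI  # lazy import; requires the openai package
    client = OpenAI()
    while True:
        batch = client.batches.retrieve(batch_id)
        if is_terminal(batch.status):
            return batch
        time.sleep(poll_seconds)
```

Long poll intervals are usually fine here, since batches are designed for workloads that tolerate up to the full 24h window.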
output_file_id
string
The ID of the file containing the outputs of successfully executed requests.
error_file_id
string
The ID of the file containing the outputs of requests with errors.
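Once a batch completes, the files referenced by output_file_id and error_file_id can be downloaded with `client.files.content` and parsed line by line. A sketch, with a hypothetical sample line illustrating the assumed per-line output shape (each line echoes the request's `custom_id` so results can be matched back to inputs):

```python
import json

def fetch_batch_output(output_file_id: str) -> list[dict]:
    """Download and parse a batch's JSONL output file (requires openai)."""
    from openai import OpenAI  # lazy import; the parsing below runs standalone
    client = OpenAI()
    raw = client.files.content(output_file_id).text
    return [json.loads(line) for line in raw.splitlines() if line]

# Illustrative (not real) output line: custom_id is echoed from the input,
# and the per-request result sits under "response".
sample_line = (
    '{"custom_id": "request-0", '
    '"response": {"status_code": 200, "body": {}}, "error": null}'
)
record = json.loads(sample_line)
results_by_id = {record["custom_id"]: record["response"]}
```

Requests that errored are written to the error file instead, so checking both files is the safe way to account for every `custom_id` submitted.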
created_at
integer
The Unix timestamp (in seconds) for when the batch was created.
metadata
object
Set of 16 key-value pairs attached to the object.
