POST /v1/batches/{batch_id}/cancel
from openai import OpenAI
client = OpenAI()

batch = client.batches.cancel("batch_abc123")
print(batch.status)  # "cancelling"
{
  "id": "batch_abc123",
  "object": "batch",
  "endpoint": "/v1/chat/completions",
  "input_file_id": "file-abc123",
  "completion_window": "24h",
  "status": "cancelling",
  "created_at": 1711073447,
  "in_progress_at": 1711073448,
  "cancelling_at": 1711074447,
  "request_counts": {
    "total": 100,
    "completed": 45,
    "failed": 2
  }
}
Cancels an in-progress batch. The batch remains in status cancelling for up to 10 minutes before changing to cancelled; any partial results produced before cancellation are then available in the output file.
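Because cancellation is asynchronous, callers that need the final cancelled state must poll. The sketch below is one way to do that, not part of the official client; it assumes an OpenAI-style client exposing `batches.cancel` and `batches.retrieve`, and its 600-second default mirrors the documented "up to 10 minutes" window.

```python
import time


def cancel_and_wait(client, batch_id, poll_interval=5.0, timeout=600.0):
    """Cancel `batch_id` and poll until it leaves the "cancelling" state.

    `client` is assumed to be an OpenAI-style client exposing
    `batches.cancel` and `batches.retrieve`. Returns the last Batch
    object seen; its status may still be "cancelling" if `timeout`
    elapses first.
    """
    batch = client.batches.cancel(batch_id)
    deadline = time.monotonic() + timeout
    while batch.status == "cancelling" and time.monotonic() < deadline:
        time.sleep(poll_interval)
        batch = client.batches.retrieve(batch_id)
    return batch
```

A caller would then check `batch.status` and, if it is "cancelled", fetch any partial results from the batch's output file.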

Path Parameters

batch_id
string
required
The ID of the batch to cancel.

Response

Returns the Batch object with status cancelling.
id
string
The batch ID.
status
string
Will be cancelling immediately after the request, then cancelled once cancellation is complete.
cancelling_at
integer
The Unix timestamp (in seconds) for when the batch started cancelling.
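A short sketch of how these fields might be read in practice, using the example payload shown above: the timestamps are Unix seconds, and subtracting completed and failed from the total in request_counts gives the requests still in flight when cancellation began (the subtraction is an interpretation of the example, not a documented field).

```python
from datetime import datetime, timezone

# Example response payload from the cancel call above.
batch = {
    "status": "cancelling",
    "created_at": 1711073447,
    "cancelling_at": 1711074447,
    "request_counts": {"total": 100, "completed": 45, "failed": 2},
}

# Timestamps are Unix seconds; convert to an aware datetime for display.
started_cancelling = datetime.fromtimestamp(batch["cancelling_at"], tz=timezone.utc)

# Requests neither completed nor failed when cancellation began.
counts = batch["request_counts"]
in_flight = counts["total"] - counts["completed"] - counts["failed"]
print(in_flight)  # 53
```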