Batch Processing
Batch endpoints let you submit up to 1,000 files in a single API call. Each file is enqueued as an independent run and processed asynchronously — use webhooks to know when the batch completes, then fetch results via the List endpoints.
Batch endpoints are available for parsing and for all three document processor types: extraction, classification, and splitting.
When to use batch endpoints
Use batch endpoints when you have many files to process — for example, end-of-day ingestion pipelines, bulk backfills, or any high-volume async workload.
Batch submissions are placed into a delayed queue and given a lower default priority than single runs. This ensures that batch workloads do not interfere with interactive use from the dashboard or single-file API calls.
For low-volume or low-latency use cases, the single-file async endpoints with polling or webhooks are the right choice.
How it works
- Submit — Send a POST request with an inputs array (1–1,000 items). For processor batches (extract, classify, split) a processor ID is required; for parse batches, config is optional. The endpoint returns immediately with a batch object in PENDING status.
- Process — Each item in inputs is enqueued as an independent run. The batch transitions through PENDING → PROCESSING → PROCESSED (or FAILED / CANCELLED).
- Consume results — Subscribe to webhooks to be notified when the batch completes, then fetch individual run results via the List endpoints filtered by batchId.
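The submit step above can be sketched as a small payload builder. This is an illustrative sketch, not the official SDK: the /parse_runs/batch path and the inputs array come from this doc, but the exact request shape should be confirmed against the API reference.

```python
def build_parse_batch_payload(file_urls):
    # A batch accepts between 1 and 1,000 inputs per request.
    if not 1 <= len(file_urls) <= 1000:
        raise ValueError("a batch accepts 1-1,000 inputs")
    # Each entry becomes an independent parse run once submitted.
    return {"inputs": [{"url": u} for u in file_urls]}

payload = build_parse_batch_payload([
    "https://example.com/a.pdf",
    "https://example.com/b.pdf",
])
# POST this payload to /parse_runs/batch; the call returns immediately
# with a batch object in PENDING status.
```

Validating the 1–1,000 bound client-side avoids a round trip for requests the API would reject anyway.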
Inline configuration (config) is not supported for batch processor requests (extract, classify, split). You must reference an existing processor by id.
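A guard like the following makes that constraint explicit in client code. The processorId field name is an assumption for illustration; only the rule itself (no inline config, reference a processor by id) comes from this doc.

```python
def build_processor_batch_payload(processor_id, inputs, config=None):
    # Batch processor requests (extract, classify, split) cannot carry
    # inline config; they must reference an existing processor by id.
    if config is not None:
        raise ValueError(
            "inline config is not supported for batch processor requests; "
            "reference an existing processor by id instead"
        )
    # "processorId" is an assumed field name for this sketch.
    return {"processorId": processor_id, "inputs": inputs}
```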
Batch parsing
Batch extraction
Batch classification
Batch splitting
Raw text input ({ text: "..." }) is not supported for split runs. Provide files as a URL ({ url: "..." }) or an Extend file ID ({ id: "..." }).
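A minimal client-side check for that rule might look like this sketch, which accepts only the two supported input shapes:

```python
def validate_split_input(item):
    # Split runs reject raw text; allow only a URL or an Extend file ID.
    if "text" in item:
        raise ValueError("raw text input is not supported for split runs")
    if "url" not in item and "id" not in item:
        raise ValueError("provide a url ({'url': ...}) or an Extend file id ({'id': ...})")
    return item
```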
Monitoring results
Webhooks
Subscribe to batch completion events to be notified when a batch finishes. The webhook payload includes the batch ID but not individual run results — use the List endpoints (below) to fetch those once notified.
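A handler for that flow can stay very small: read the batch ID from the event, then call the List endpoint. This is a sketch with a stubbed fetcher; the batchId payload field name is assumed here and should be checked against the actual webhook payload schema.

```python
def handle_batch_completed(event, fetch_runs_for_batch):
    # The webhook payload identifies the batch but does not embed run
    # results; fetch those from the List endpoints once notified.
    batch_id = event["batchId"]  # payload field name assumed for this sketch
    return fetch_runs_for_batch(batch_id)

# Example wiring with a stubbed List call:
runs = handle_batch_completed(
    {"batchId": "bpar_abc"},
    lambda bid: [{"batchId": bid, "status": "PROCESSED"}],
)
```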
Batch parse run events (for POST /parse_runs/batch):
Batch processor run events (for extract, classify, and split batches — all emit the same event types):
See Webhook Configuration for setup instructions.
Checking batch status
You can poll the GET batch endpoint at any time to check the current status of a batch. The same endpoint works for all batch types — parse, extract, classify, and split. You can identify the type from the batch ID prefix (bpar_ for parse, bpr_ for processor batches):
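Since the two prefixes are disjoint, dispatching on batch type is a simple string check:

```python
def batch_type_from_id(batch_id):
    # bpar_ marks a parse batch; bpr_ marks a processor batch
    # (extract, classify, or split).
    if batch_id.startswith("bpar_"):
        return "parse"
    if batch_id.startswith("bpr_"):
        return "processor"
    raise ValueError("unrecognized batch id prefix: " + batch_id)
```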
The batch status field reflects the aggregate state:
Fetching individual results
Once the batch has completed, retrieve individual run results using the List endpoints with a batchId filter:
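Building that filtered List request can be sketched as below. The base URL is a placeholder; batchId as a query parameter follows this doc, while the status parameter is an assumption for illustration.

```python
from urllib.parse import urlencode

def list_runs_url(base_url, batch_id, status=None):
    # Filter a List endpoint by batchId; optionally narrow by run status.
    params = {"batchId": batch_id}
    if status is not None:
        params["status"] = status  # "status" filter assumed for this sketch
    return base_url + "?" + urlencode(params)

url = list_runs_url("https://api.example.com/extract_runs", "bpr_456")
```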

