# Batch API
The Batch API enables you to send multiple messages to your AI agents asynchronously. This is ideal for processing large volumes of requests efficiently without maintaining persistent connections.
## Overview
The Batch API allows you to:
- Submit multiple requests in a single API call
- Process requests asynchronously in the background
- Track batch progress and retrieve results
- Receive webhook notifications when batches complete
- Cancel in-progress batches
## Create Batch

Submit a batch of requests for asynchronous processing.

**Endpoint:** `POST /api/v1/batches`

### Request Body

| Field | Type | Required | Description |
|---|---|---|---|
| `agent_id` | UUID | Yes | Default agent ID for all requests (must have a published version) |
| `requests` | array | Yes | List of requests to process (1-1000 items) |
| `webhook` | object | No | Webhook configuration for notifications |
### Request Item Schema

Each item in the `requests` array:

| Field | Type | Required | Description |
|---|---|---|---|
| `custom_id` | string | Yes | Your correlation key (max 256 characters, must be unique within the batch) |
| `message` | string | Yes | The message to send |
| `agent_id` | UUID | No | Override agent for this specific request |
| `external_user_id` | string | No | External user identifier |
| `name` | string | No | Name for the created chat (max 256 characters) |
| `attached_file_uuids` | array | No | List of file attachment UUIDs |
| `history_id` | integer | No | Existing chat history ID to continue a conversation |
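The schema above can be illustrated with two request items: one that starts a fresh conversation, and one that continues an existing chat with an attachment. This is a sketch only — the `custom_id` values, `history_id`, and file UUID below are placeholders, not real resources.

```python
# A request item that starts a new conversation (only custom_id and message are required).
new_conversation = {
    "custom_id": "order-42-q1",          # must be unique within the batch
    "message": "Where is my order?",
    "external_user_id": "user_123",
    "name": "Order inquiry",
}

# A request item that continues an existing chat and attaches a file.
follow_up = {
    "custom_id": "order-42-q2",
    "message": "Can I change the delivery address?",
    "history_id": 12345,                 # continue this chat history
    "attached_file_uuids": ["9f3c2a10-1111-2222-3333-444455556666"],
}

requests_payload = [new_conversation, follow_up]
```

Keeping `custom_id` values unique is the caller's responsibility; the batch is rejected if two items share one.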
### Webhook Configuration

| Field | Type | Required | Description |
|---|---|---|---|
| `url` | string | Yes | Webhook URL (must be HTTPS) |
| `events` | array | No | Events to notify: `completed`, `failed`, `cancelled` (default: `["completed", "failed"]`) |
| `secret` | string | No | Secret for signing webhook payloads (max 255 characters) |
### Example

```bash
curl -X POST "https://api.codeer.ai/api/v1/batches" \
  -H "x-api-key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "agent_id": "550e8400-e29b-41d4-a716-446655440000",
    "requests": [
      {
        "custom_id": "req-001",
        "message": "What are your business hours?",
        "external_user_id": "user_123"
      },
      {
        "custom_id": "req-002",
        "message": "How do I return a product?",
        "external_user_id": "user_456"
      }
    ],
    "webhook": {
      "url": "https://your-server.com/webhooks/codeer",
      "events": ["completed", "failed"],
      "secret": "your-webhook-secret"
    }
  }'
```
### Response

**Status:** `202 Accepted`

```json
{
  "data": {
    "id": "batch-uuid-here",
    "status": "in_progress",
    "agent_id": "550e8400-e29b-41d4-a716-446655440000",
    "total_requests": 2,
    "completed_requests": 0,
    "failed_requests": 0,
    "created_at": "2024-01-15T10:30:00Z",
    "completed_at": null,
    "cancelled_at": null
  },
  "message": null,
  "error_code": 0
}
```
> **Agent must be published:** Both the default `agent_id` and any per-request `agent_id` overrides must reference agents that have been published. Requests with unpublished agents will be rejected.
## List Batches

Retrieve batches with pagination and optional status filtering.

**Endpoint:** `GET /api/v1/batches`

### Query Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `limit` | integer | 50 | Results per page (max 1000) |
| `offset` | integer | 0 | Number of results to skip |
| `status` | string | - | Filter by status: `in_progress`, `completed`, `failed`, `cancelling`, `cancelled` |
### Example

```bash
curl -X GET "https://api.codeer.ai/api/v1/batches?limit=10&status=completed" \
  -H "x-api-key: YOUR_API_KEY"
```
### Response

```json
{
  "data": [
    {
      "id": "batch-uuid-here",
      "status": "completed",
      "agent_id": "550e8400-e29b-41d4-a716-446655440000",
      "total_requests": 100,
      "completed_requests": 98,
      "failed_requests": 2,
      "created_at": "2024-01-15T10:30:00Z",
      "completed_at": "2024-01-15T10:45:00Z",
      "cancelled_at": null
    }
  ],
  "pagination": {
    "limit": 10,
    "offset": 0,
    "total_records": 50,
    "total_pages": 5,
    "current_page": 1,
    "next_page": "https://api.codeer.ai/api/v1/batches?offset=10&limit=10",
    "prev_page": null
  },
  "message": null,
  "error_code": 0
}
```
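Because each page embeds a ready-to-use `next_page` URL (`null` on the last page), collecting every batch reduces to following that field. The sketch below takes any `fetch_page` callable that returns the decoded JSON for a URL — in real use it would wrap something like `requests.get(url, headers={"x-api-key": API_KEY}).json()`:

```python
# Walk the paginated listing by following `pagination.next_page` until it is null.
# `fetch_page(url)` must return the decoded JSON body of a list response.
def fetch_all_batches(fetch_page, first_url="https://api.codeer.ai/api/v1/batches?limit=50"):
    batches, url = [], first_url
    while url:
        page = fetch_page(url)
        batches.extend(page["data"])
        url = page["pagination"]["next_page"]  # None/null on the last page
    return batches
```

Injecting the fetcher keeps the loop trivial to unit-test with canned pages.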
## Get Batch

Retrieve the current status of a specific batch.

**Endpoint:** `GET /api/v1/batches/{batch_id}`

### Path Parameters

| Parameter | Type | Description |
|---|---|---|
| `batch_id` | UUID | The batch ID |
### Example

```bash
curl -X GET "https://api.codeer.ai/api/v1/batches/batch-uuid-here" \
  -H "x-api-key: YOUR_API_KEY"
```
### Response

```json
{
  "data": {
    "id": "batch-uuid-here",
    "status": "in_progress",
    "agent_id": "550e8400-e29b-41d4-a716-446655440000",
    "total_requests": 100,
    "completed_requests": 45,
    "failed_requests": 2,
    "created_at": "2024-01-15T10:30:00Z",
    "completed_at": null,
    "cancelled_at": null
  },
  "message": null,
  "error_code": 0
}
```
### Batch Status Values

| Status | Description |
|---|---|
| `in_progress` | Batch is being processed |
| `completed` | All requests have been processed |
| `failed` | Batch processing failed |
| `cancelling` | Cancellation is in progress |
| `cancelled` | Batch was cancelled |
## Get Batch Results

Retrieve the results of individual requests within a batch.

**Endpoint:** `GET /api/v1/batches/{batch_id}/results`

### Path Parameters

| Parameter | Type | Description |
|---|---|---|
| `batch_id` | UUID | The batch ID |

### Query Parameters

| Parameter | Type | Default | Description |
|---|---|---|---|
| `limit` | integer | 50 | Results per page (max 1000) |
| `offset` | integer | 0 | Number of results to skip |
| `status` | string | - | Filter by status: `pending`, `processing`, `success`, `failed` |
### Example

```bash
curl -X GET "https://api.codeer.ai/api/v1/batches/batch-uuid-here/results?status=success" \
  -H "x-api-key: YOUR_API_KEY"
```
### Response

```json
{
  "data": [
    {
      "custom_id": "req-001",
      "status": "success",
      "response_content": "Our business hours are Monday to Friday, 9 AM to 6 PM EST.",
      "response_usage": {
        "tool_calls": [
          {
            "model": "gemini-2.5-flash",
            "call_type": "tool_call",
            "total_tokens": 195,
            "prompt_tokens": 150,
            "completion_tokens": 45
          }
        ],
        "total_calls": 1,
        "total_tokens": 195,
        "main_response": null,
        "total_prompt_tokens": 150,
        "total_completion_tokens": 45
      },
      "error_code": null,
      "error_message": null,
      "history_id": 12345,
      "conversation_id": 6789,
      "processed_at": "2024-01-15T10:31:00Z"
    },
    {
      "custom_id": "req-002",
      "status": "failed",
      "response_content": null,
      "response_usage": null,
      "error_code": 500,
      "error_message": "Internal server error during processing",
      "history_id": null,
      "conversation_id": null,
      "processed_at": "2024-01-15T10:31:05Z"
    }
  ],
  "pagination": { ... },
  "message": null,
  "error_code": 0
}
```
### Request Status Values

| Status | Description |
|---|---|
| `pending` | Request is queued for processing |
| `processing` | Request is currently being processed |
| `success` | Request completed successfully |
| `failed` | Request failed |
### Error Codes in Results

Error codes in results follow HTTP semantics:

| Error Code | Description |
|---|---|
| 400 | Validation error (bad request data) |
| 408 | Batch processing timed out before the request was processed |
| 499 | Batch was cancelled |
| 500 | Internal server error during processing |
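These codes suggest a natural triage of failed results: requests that never ran to completion (408 timeout, 499 cancelled) can be resubmitted as-is, while a 400 means the request data itself must be fixed first. The split below is a sketch of that policy, under the assumption that 500s are also worth one retry; adjust to your own tolerance.

```python
# Split failed results by error code: resubmittable vs. needs-data-fix.
# Input is a list of result objects as returned by GET /batches/{id}/results.
def triage_failed(results):
    retryable, permanent = [], []
    for r in results:
        if r["status"] != "failed":
            continue  # only failed results carry an error_code
        if r["error_code"] in (408, 499, 500):
            retryable.append(r["custom_id"])
        else:  # e.g. 400 validation errors
            permanent.append(r["custom_id"])
    return retryable, permanent
```

The returned `custom_id` lists map directly back to your original request items, which is the point of the correlation key.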
## Cancel Batch

Cancel an in-progress batch. Pending requests will be cancelled, while requests that are already processing will complete.

**Endpoint:** `POST /api/v1/batches/{batch_id}/cancel`

### Path Parameters

| Parameter | Type | Description |
|---|---|---|
| `batch_id` | UUID | The batch ID |
### Example

```bash
curl -X POST "https://api.codeer.ai/api/v1/batches/batch-uuid-here/cancel" \
  -H "x-api-key: YOUR_API_KEY"
```
### Response

**Status:** `202 Accepted`

```json
{
  "data": {
    "id": "batch-uuid-here",
    "status": "cancelling",
    "agent_id": "550e8400-e29b-41d4-a716-446655440000",
    "total_requests": 100,
    "completed_requests": 45,
    "failed_requests": 2,
    "created_at": "2024-01-15T10:30:00Z",
    "completed_at": null,
    "cancelled_at": null
  },
  "message": null,
  "error_code": 0
}
```
### Cancellation Behavior

- Only batches with status `in_progress` can be cancelled
- Requests already in the `processing` state will complete normally
- Pending requests will be marked as failed with error code `499`
- The batch status will transition from `cancelling` to `cancelled` once all in-flight requests complete
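Because cancellation is asynchronous (`202 Accepted`, then `cancelling` until in-flight requests drain), callers who need a definitive end state should poll after cancelling. The sketch below injects `post` and `get` callables standing in for authenticated HTTP calls (e.g. `requests.post`/`requests.get` with the `x-api-key` header), which also keeps it testable:

```python
import time

# Cancel a batch, then poll until it reaches a terminal state.
# `post(url)` issues the cancel; `get(url)` returns the decoded Get Batch JSON.
def cancel_and_wait(batch_id, post, get, poll_seconds=5,
                    base_url="https://api.codeer.ai/api/v1"):
    post(f"{base_url}/batches/{batch_id}/cancel")
    while True:
        status = get(f"{base_url}/batches/{batch_id}")["data"]["status"]
        if status in ("cancelled", "completed", "failed"):  # terminal states
            return status
        time.sleep(poll_seconds)  # still "cancelling" (or "in_progress")
```

Note a cancelled batch can legitimately finish as `completed` if every request was already processing when the cancel arrived, so the poll checks all terminal states, not just `cancelled`.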
## Webhooks

When configured, webhooks are sent as HTTP POST requests to your specified URL.

### Webhook Payload

```json
{
  "event": "batch.completed",
  "timestamp": "2024-01-15T10:45:00Z",
  "data": {
    "id": "batch-uuid-here",
    "status": "completed",
    "total_requests": 100,
    "completed_requests": 98,
    "failed_requests": 2
  }
}
```
### Webhook Signature

If you provided a `secret` in the webhook configuration, the payload is signed using HMAC-SHA256.

Read the `X-Codeer-Signature` and `X-Codeer-Timestamp` headers, then verify the signature over `{timestamp}.{raw_payload}`:

```python
import hmac
import hashlib

def verify_webhook(payload: bytes, signature: str, timestamp: str, secret: str) -> bool:
    signed_payload = f"{timestamp}.".encode("utf-8") + payload
    expected = hmac.new(
        secret.encode("utf-8"),
        signed_payload,
        hashlib.sha256
    ).hexdigest()
    return hmac.compare_digest(f"sha256={expected}", signature)
```
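A quick way to sanity-check a verifier like the one above is a local round trip: sign a payload the same way the server would (HMAC-SHA256 over `{timestamp}.{raw_payload}`, prefixed with `sha256=`), then confirm verification passes and that tampering with the timestamp fails. The `sign` helper below is for testing only; in production only the server signs.

```python
import hmac
import hashlib

def sign(payload: bytes, timestamp: str, secret: str) -> str:
    # Produce the signature the way the server is documented to: HMAC-SHA256
    # over "{timestamp}." + raw payload bytes, hex-encoded, "sha256=" prefix.
    digest = hmac.new(
        secret.encode("utf-8"),
        f"{timestamp}.".encode("utf-8") + payload,
        hashlib.sha256,
    ).hexdigest()
    return f"sha256={digest}"

def verify_webhook(payload: bytes, signature: str, timestamp: str, secret: str) -> bool:
    return hmac.compare_digest(sign(payload, timestamp, secret), signature)
```

Always verify against the raw request body bytes, before any JSON parsing or re-serialization, since re-encoding can change whitespace and break the HMAC.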
### Webhook Events

| Event | Description |
|---|---|
| `batch.completed` | All requests in the batch have been processed |
| `batch.failed` | Batch processing failed |
| `batch.cancelled` | Batch was cancelled |
## Code Examples

### Python

```python
import requests
import time

API_KEY = "your-api-key"
BASE_URL = "https://api.codeer.ai/api/v1"

headers = {
    "x-api-key": API_KEY,
    "Content-Type": "application/json"
}

# Create a batch
batch_response = requests.post(
    f"{BASE_URL}/batches",
    headers=headers,
    json={
        "agent_id": "your-agent-id",
        "requests": [
            {"custom_id": f"req-{i}", "message": f"Question {i}"}
            for i in range(10)
        ]
    }
)
batch = batch_response.json()["data"]
batch_id = batch["id"]
print(f"Created batch: {batch_id}")

# Poll for completion
while True:
    status_response = requests.get(
        f"{BASE_URL}/batches/{batch_id}",
        headers=headers
    )
    status = status_response.json()["data"]["status"]
    print(f"Batch status: {status}")
    if status in ["completed", "failed", "cancelled"]:
        break
    time.sleep(5)

# Get results
results_response = requests.get(
    f"{BASE_URL}/batches/{batch_id}/results",
    headers=headers
)
results = results_response.json()["data"]
for result in results:
    print(f"{result['custom_id']}: {result['status']}")
    if result["response_content"]:
        print(f"  Response: {result['response_content'][:100]}...")
```
### JavaScript

```javascript
const API_KEY = 'your-api-key';
const BASE_URL = 'https://api.codeer.ai/api/v1';

const headers = {
  'x-api-key': API_KEY,
  'Content-Type': 'application/json'
};

// Create a batch
const batchResponse = await fetch(`${BASE_URL}/batches`, {
  method: 'POST',
  headers,
  body: JSON.stringify({
    agent_id: 'your-agent-id',
    requests: Array.from({ length: 10 }, (_, i) => ({
      custom_id: `req-${i}`,
      message: `Question ${i}`
    }))
  })
});
const { data: batch } = await batchResponse.json();
console.log(`Created batch: ${batch.id}`);

// Poll for completion
const pollBatch = async (batchId) => {
  while (true) {
    const statusResponse = await fetch(`${BASE_URL}/batches/${batchId}`, { headers });
    const { data } = await statusResponse.json();
    console.log(`Batch status: ${data.status}`);
    if (['completed', 'failed', 'cancelled'].includes(data.status)) {
      return data;
    }
    await new Promise(resolve => setTimeout(resolve, 5000));
  }
};

await pollBatch(batch.id);

// Get results
const resultsResponse = await fetch(`${BASE_URL}/batches/${batch.id}/results`, { headers });
const { data: results } = await resultsResponse.json();
results.forEach(result => {
  console.log(`${result.custom_id}: ${result.status}`);
  if (result.response_content) {
    console.log(`  Response: ${result.response_content.slice(0, 100)}...`);
  }
});
```