Summary
The Mistral Batch Jobs API (POST /v1/batch/jobs) is not instrumented. Calls to client.batch.jobs.create() produce zero Braintrust tracing. This is a documented, GA production feature on the Mistral platform for running batch inference across chat completions, embeddings, FIM, OCR, audio transcriptions, moderations, and classifications.
The Braintrust Anthropic integration does instrument an equivalent batch API (client.messages.batches.create() and client.messages.batches.results()), making this an asymmetry across providers in this repo.
What is missing
| Mistral Resource | Method | Instrumented? |
| --- | --- | --- |
| client.chat | complete(), stream() | Yes |
| client.embeddings | create() | Yes |
| client.fim | complete(), stream() | Yes |
| client.agents | complete(), stream() | Yes |
| client.batch.jobs | create() | No |
| client.batch.jobs | get(), list(), cancel() | No (CRUD; lower priority) |
The generative-execution-relevant surfaces are batch.jobs.create() (submitting a batch of model inference requests) and retrieving the results (via output file download). The list/get/cancel methods are CRUD management and lower priority.
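To make the results side concrete, here is a minimal sketch of parsing a downloaded batch output file; it assumes each JSONL line pairs a custom_id with a response object (the field names are an assumption, not confirmed against the Mistral output schema):

```python
import json

def parse_batch_results(jsonl_text):
    # Map each custom_id to its response object from a downloaded
    # batch output file. The "custom_id"/"response" field names are
    # assumptions about the per-line output format.
    results = {}
    for line in jsonl_text.splitlines():
        if line.strip():
            entry = json.loads(line)
            results[entry["custom_id"]] = entry.get("response")
    return results
```

A results wrapper in the integration could run a parser like this over the output file and attach per-request outputs to the span.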
At minimum, instrumentation for batch.jobs.create() should create a span capturing:
- Input: list of custom IDs from the batch requests, number of requests, target endpoint
- Output: batch job ID, processing status, request counts
- Metrics: latency (submission time)
- Metadata: model(s), target endpoint, batch size, timeout configuration
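The span shape above can be sketched as a wrapper. This is a hypothetical sketch, not the real patcher plumbing: create_fn and start_span are stand-ins for the SDK method and the Braintrust span factory, and the requests/endpoint/model/timeout_hours kwarg names are assumptions about the create signature:

```python
import time

def traced_batch_create(create_fn, start_span, **kwargs):
    # Extract span input from the submitted batch requests. The
    # "requests"/"endpoint"/"model"/"timeout_hours" kwarg names are
    # assumptions standing in for the real SDK parameters.
    requests = kwargs.get("requests") or []
    span_input = {
        "custom_ids": [r.get("custom_id") for r in requests],
        "request_count": len(requests),
        "endpoint": kwargs.get("endpoint"),
    }
    metadata = {
        "model": kwargs.get("model"),
        "endpoint": kwargs.get("endpoint"),
        "batch_size": len(requests),
        "timeout_hours": kwargs.get("timeout_hours"),
    }
    start = time.time()
    job = create_fn(**kwargs)  # the underlying batch.jobs.create call
    start_span(
        name="batch.jobs.create",
        type="task",
        input=span_input,
        metadata=metadata,
        metrics={"submission_seconds": time.time() - start},
        output={
            "job_id": getattr(job, "id", None),
            "status": getattr(job, "status", None),
        },
    )
    return job
```

The real implementation would live in tracing.py alongside the existing chat/embeddings/FIM/agents wrappers and be registered via a BatchPatcher in patchers.py.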
This mirrors the pattern used by the Anthropic batch instrumentation, which creates a task-type span with input custom IDs, output status/counts, and metadata including model and number of requests.
Batch API capabilities
The Mistral batch API supports batching requests to all major generative endpoints:
/v1/chat/completions
/v1/embeddings
/v1/fim/completions
/v1/ocr
/v1/audio/transcriptions
/v1/moderations, /v1/chat/moderations
/v1/classifications
Both file-based batching (up to 1M requests via JSONL upload) and inline batching (up to 10K requests embedded in the create call) are supported.
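For illustration, a sketch of building a JSONL batch input for the chat completions endpoint; it assumes the common custom_id/body per-line convention, and that the target model is set on the job rather than per line (both are assumptions about the batch-file format):

```python
import json

def build_batch_input(prompts):
    # Serialize chat-completion requests as batch-input JSONL, one
    # request per line. The "custom_id"/"body" field names follow the
    # common batch-file convention and are an assumption here.
    lines = []
    for i, prompt in enumerate(prompts):
        lines.append(json.dumps({
            "custom_id": f"req-{i}",
            "body": {"messages": [{"role": "user", "content": prompt}]},
        }))
    return "\n".join(lines)
```

These custom IDs are exactly what the proposed span input would capture at submission time.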
Braintrust docs status
not_found — The Mistral integration page documents chat completions, embeddings, FIM, and agents. No mention of batch API support.
Upstream sources
mistralai (on PyPI): client.batch.jobs.create(), client.batch.jobs.get(), client.batch.jobs.list(), client.batch.jobs.cancel()
Local files inspected
py/src/braintrust/integrations/mistral/patchers.py — defines patchers for Chat, Embeddings, Fim, Agents; zero references to batch, jobs, or Batch
py/src/braintrust/integrations/mistral/tracing.py — wrapper functions for chat, embeddings, FIM, agents only; no batch wrappers
py/src/braintrust/integrations/mistral/integration.py — integration class registers 4 composite patchers (ChatPatcher, EmbeddingsPatcher, FimPatcher, AgentsPatcher); no BatchPatcher
py/src/braintrust/integrations/mistral/test_mistral.py — no batch test cases
py/src/braintrust/integrations/anthropic/tracing.py — Anthropic batch API IS instrumented (BatchesCreate, BatchesResults classes) as precedent
py/noxfile.py — test_mistral session tests against LATEST and 1.12.4; no batch coverage
Relationship to existing issues
- #222 tracks the Mistral OCR API (client.ocr.process()) not being instrumented (separate endpoint gap)
- #223 tracks the Mistral Audio APIs (client.audio.speech, client.audio.transcriptions) not being instrumented (separate endpoint gap)