Summary
The Anthropic SDK's Message Batches API (client.messages.batches.*) is a stable, GA API surface that is not instrumented at all. This API enables asynchronous batch processing of message requests with 50% cost savings and is particularly relevant for large-scale evaluations.
The current Anthropic wrapper (wrapAnthropic) and auto-instrumentation plugin only instrument messages.create and beta.messages.create. The messages.batches namespace is not proxied, has no channel definitions, and has no plugin handler.
What instrumentation is missing
Wrapper (js/src/wrappers/anthropic.ts)
The anthropicProxy function (lines 36–51) only intercepts the beta and messages properties. The messages.batches sub-namespace passes through unmodified, so none of its methods are wrapped.
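A minimal sketch of how the proxy could be extended to cover the batches sub-namespace. This is an assumption about the fix, not the repo's actual implementation: `proxyMessagesWithBatches` and `wrapFn` are hypothetical stand-ins for anthropicProxy and its span-creating wrapper.

```typescript
// Hypothetical sketch: intercept `messages.batches` instead of letting it
// pass through. `wrapFn` stands in for the real tracing wrapper.
type WrapFn = (
  fn: (...args: any[]) => any,
  name: string,
) => (...args: any[]) => any;

function proxyMessagesWithBatches<T extends object>(messages: T, wrapFn: WrapFn): T {
  return new Proxy(messages, {
    get(target, prop, receiver) {
      const value = Reflect.get(target, prop, receiver);
      if (prop === "batches" && value && typeof value === "object") {
        // Wrap every method on messages.batches
        // (create, list, retrieve, cancel, results).
        return new Proxy(value as object, {
          get(batches, batchProp) {
            const method = Reflect.get(batches, batchProp);
            if (typeof method === "function") {
              return wrapFn(
                method.bind(batches),
                `messages.batches.${String(batchProp)}`,
              );
            }
            return method;
          },
        });
      }
      return value;
    },
  });
}
```

The nested Proxy keeps the change local: only property access on `batches` pays the interception cost, and unknown future batch methods are wrapped automatically.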
Vendored types (js/src/vendor-sdk-types/anthropic.ts)
The AnthropicClient interface only declares messages and beta. There is no type for the batches sub-namespace.
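A possible shape for the missing vendored type, kept deliberately loose in the same spirit as the existing vendored declarations. The interface names are hypothetical; the method list and the `custom_id`/`params` request shape follow Anthropic's Batches API documentation.

```typescript
// Hypothetical vendored types for the batches sub-namespace.
interface MessageBatchesNamespace {
  create(body: {
    requests: Array<{ custom_id: string; params: unknown }>;
  }): Promise<unknown>;
  list(params?: unknown): Promise<unknown>;
  retrieve(batchId: string): Promise<unknown>;
  cancel(batchId: string): Promise<unknown>;
  results(batchId: string): Promise<unknown>;
}

interface AnthropicMessagesNamespace {
  create(body: unknown): Promise<unknown>;
  batches: MessageBatchesNamespace;
}
```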
Channels (js/src/instrumentation/plugins/anthropic-channels.ts)
Only messagesCreate and betaMessagesCreate channels are defined. No batch-related channels exist.
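A sketch of the missing channel definitions, assuming the file follows Node's diagnostics_channel pattern. The channel names and the `batchChannels` export are assumptions modeled on the existing messagesCreate / betaMessagesCreate pair; the repo's actual naming scheme may differ.

```typescript
import diagnostics_channel from "node:diagnostics_channel";

// Hypothetical channel definitions for the batches sub-namespace,
// one per uninstrumented SDK method.
export const batchChannels = {
  batchesCreate: diagnostics_channel.channel("anthropic:messages.batches.create"),
  batchesList: diagnostics_channel.channel("anthropic:messages.batches.list"),
  batchesRetrieve: diagnostics_channel.channel("anthropic:messages.batches.retrieve"),
  batchesCancel: diagnostics_channel.channel("anthropic:messages.batches.cancel"),
  batchesResults: diagnostics_channel.channel("anthropic:messages.batches.results"),
};
```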
Plugin (js/src/instrumentation/plugins/anthropic-plugin.ts)
Only subscribes to messagesCreate and betaMessagesCreate channels.
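On the plugin side, the missing piece is a subscription per batch channel. The sketch below assumes diagnostics_channel-based subscription like the existing message-create handlers; `SpanLogger` and `startSpan` are hypothetical stand-ins for Braintrust's real span API.

```typescript
import diagnostics_channel from "node:diagnostics_channel";

// Hypothetical span interface standing in for the real tracing API.
type SpanLogger = { startSpan: (name: string) => { end: () => void } };

// Subscribe one handler per batch operation; a real handler would
// attach request/response metadata to the span before ending it.
function subscribeBatchChannels(logger: SpanLogger): void {
  const ops = ["create", "list", "retrieve", "cancel", "results"];
  for (const op of ops) {
    diagnostics_channel.subscribe(
      `anthropic:messages.batches.${op}`,
      () => logger.startSpan(`anthropic.messages.batches.${op}`).end(),
    );
  }
}
```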
Upstream SDK methods not instrumented
- client.messages.batches.create() — submit a batch of up to 100,000 message requests
- client.messages.batches.list() — list batches
- client.messages.batches.retrieve() — get batch status
- client.messages.batches.cancel() — cancel a batch
- client.messages.batches.results() — retrieve batch results
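The lifecycle these methods form, none of which currently produces spans, can be sketched as follows. The request and result field names (`custom_id`, `params`, `processing_status`) follow Anthropic's Batches API docs; the `runBatchLifecycle` helper is hypothetical and takes the client as a parameter so it can run against a stub.

```typescript
// Hypothetical helper tracing the create -> retrieve -> results
// lifecycle that instrumentation would need to cover.
async function runBatchLifecycle(client: any): Promise<string[]> {
  const touched: string[] = [];

  // 1. Submit up to 100,000 requests in one batch.
  const batch = await client.messages.batches.create({
    requests: [
      {
        custom_id: "eval-1",
        params: {
          model: "claude-3-5-sonnet-latest",
          max_tokens: 1024,
          messages: [{ role: "user", content: "Hello" }],
        },
      },
    ],
  });
  touched.push(`create:${batch.id}`);

  // 2. Poll batch status.
  const status = await client.messages.batches.retrieve(batch.id);
  touched.push(`retrieve:${status.processing_status}`);

  // 3. Stream per-request results once processing has ended.
  if (status.processing_status === "ended") {
    for await (const entry of await client.messages.batches.results(batch.id)) {
      touched.push(`result:${entry.custom_id}`);
    }
  }
  return touched;
}
```

Each numbered step above is a natural span boundary, and step 3 is where per-request results could be logged as child spans.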
Impact
Users running large-scale evaluations or bulk content processing through the Anthropic Message Batches API get no automatic tracing of batch lifecycle operations. Batch creation, status polling, and result retrieval are invisible to Braintrust. This is particularly relevant since:
- The batch API offers 50% cost savings, making it attractive for evaluation workloads
- Braintrust is an evaluation platform — batch processing of LLM requests is a core adjacent use case
- Individual message results from batches are not auto-logged as spans
Braintrust docs status
not_found — No Braintrust documentation mentions the Anthropic Message Batches API. The Anthropic integration docs at https://www.braintrust.dev/docs/instrument/wrap-providers list @anthropic-ai/sdk as supported but do not mention batch processing.
Upstream reference
- Anthropic Batch Processing guide: https://docs.anthropic.com/en/docs/build-with-claude/batch-processing
- Anthropic Batches API reference: https://docs.anthropic.com/en/api/creating-message-batches
- The Message Batches API is GA/stable (not beta)
- Supports all active Anthropic models
- Any request that can be made to the Messages API can be included in a batch (vision, tool use, system messages, multi-turn conversations, beta features)
Local files inspected
- js/src/wrappers/anthropic.ts (lines 36–51: anthropicProxy only intercepts beta and messages)
- js/src/vendor-sdk-types/anthropic.ts (lines 11–13: AnthropicClient has no batches type)
- js/src/instrumentation/plugins/anthropic-channels.ts (only messagesCreate and betaMessagesCreate)
- js/src/instrumentation/plugins/anthropic-plugin.ts (only subscribes to message create channels)
- e2e/scenarios/anthropic-instrumentation/ (no batch-related test scenarios)