feat(ai-proxy): add Anthropic LLM provider support #1435
base: main
Conversation
Add support for Anthropic's Claude models in the ai-proxy package using @langchain/anthropic. This allows users to configure Claude as their AI provider alongside OpenAI.

Changes:
- Add @langchain/anthropic dependency
- Add ANTHROPIC_MODELS constant with supported Claude models
- Add AnthropicConfiguration type and AnthropicModel type
- Add AnthropicUnprocessableError for Anthropic-specific errors
- Implement message conversion from OpenAI format to LangChain format
- Implement response conversion from LangChain format back to OpenAI format
- Add tool binding support for Anthropic with tool_choice conversion
- Add comprehensive tests for the Anthropic provider

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
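For illustration, a minimal configuration sketch in TypeScript. The field names (name, provider, model, apiKey) follow this PR's summary; the model string and the way the object is handed to the proxy are placeholders, not confirmed by this excerpt.

```ts
// Hypothetical provider configuration for the ai-proxy package.
// The model value is illustrative and would be one of the entries
// exposed by the ANTHROPIC_MODELS constant added in this PR.
const claudeProvider = {
  name: 'claude',
  provider: 'anthropic',
  model: 'claude-3-5-sonnet-latest',
  apiKey: process.env.ANTHROPIC_API_KEY,
};
```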
Qlty Cloud: 1 new issue.
Coverage Impact: This PR will not change total coverage. See the full report on Qlty Cloud.
tool_calls: msg.tool_calls.map(tc => ({
  id: tc.id,
  name: tc.function.name,
  args: JSON.parse(tc.function.arguments),
JSON.parse() can throw if tc.function.arguments contains malformed JSON. This call is outside the try-catch block (line 210), so errors would propagate as an unhandled SyntaxError instead of being wrapped in AnthropicUnprocessableError. The OpenAI API can return malformed JSON in edge cases (there are documented issues).

Fix (packages/ai-proxy/src/provider-dispatcher.ts:259): wrap JSON.parse in a try-catch, or move the convertMessagesToLangChain call inside the existing try-catch block at line 210.
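A minimal sketch of the first option, assuming a simplified tool-call shape; only the error-wrapping pattern around JSON.parse is the point, not the real dispatcher code.

```ts
// Stand-in for the error class added in this PR.
class AnthropicUnprocessableError extends Error {}

interface OpenAIToolCall {
  id: string;
  function: { name: string; arguments: string };
}

// Parse tool-call arguments, surfacing malformed JSON as an
// AnthropicUnprocessableError instead of an unhandled SyntaxError.
function parseToolCallArgs(tc: OpenAIToolCall): Record<string, unknown> {
  try {
    return JSON.parse(tc.function.arguments);
  } catch (e) {
    throw new AnthropicUnprocessableError(
      `Malformed tool call arguments for "${tc.function.name}": ${(e as Error).message}`,
    );
  }
}
```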
  });
}
…
return new AIMessage(msg.content);
The LangChain message constructors (SystemMessage, HumanMessage, AIMessage) fail when given null content, while the OpenAI API allows content: null for assistant messages (per the API spec). The OpenAIMessage interface at line 117 declares content: string but should allow null. Line 255 handles this correctly with msg.content || '', but lines 249, 251, 264, 267, and 271 pass content through directly.
Suggested change:
- return new AIMessage(msg.content);
+ return new AIMessage(msg.content || '');
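A sketch of the null-content handling, assuming a trimmed-down OpenAIMessage interface; the real interface has more fields, but the relevant change is widening content to string | null and falling back to an empty string before constructing LangChain messages.

```ts
import { AIMessage, HumanMessage, SystemMessage } from '@langchain/core/messages';

// Trimmed-down interface: the key change is that content may be null,
// which the OpenAI API permits for assistant messages.
interface OpenAIMessage {
  role: 'system' | 'user' | 'assistant';
  content: string | null;
}

function toLangChainMessage(msg: OpenAIMessage) {
  const content = msg.content ?? ''; // LangChain constructors reject null content

  switch (msg.role) {
    case 'system':
      return new SystemMessage(content);
    case 'user':
      return new HumanMessage(content);
    default:
      return new AIMessage(content);
  }
}
```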
… dispatcher

- Move convertMessagesToLangChain inside the try-catch to properly handle JSON.parse errors
- Update the OpenAIMessage interface to allow null content (per the OpenAI API spec)
- Add null content handling for all message types, with a fallback to the empty string

Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>
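A rough sketch of the structural part of this commit, moving the conversion call inside the existing try-catch. convertMessagesToLangChain comes from the PR discussion, while invokeModel is a placeholder for whatever the dispatcher actually calls; neither signature is confirmed by this excerpt.

```ts
class AnthropicUnprocessableError extends Error {}

declare function convertMessagesToLangChain(messages: unknown[]): unknown[];
declare function invokeModel(messages: unknown[]): Promise<unknown>; // placeholder

async function dispatch(messages: unknown[]): Promise<unknown> {
  try {
    // Previously executed before the try block, so a JSON.parse failure
    // escaped as a raw SyntaxError instead of a wrapped provider error.
    const converted = convertMessagesToLangChain(messages);
    return await invokeModel(converted);
  } catch (e) {
    if (e instanceof AnthropicUnprocessableError) throw e;
    throw new AnthropicUnprocessableError((e as Error).message);
  }
}
```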
Summary
- Add Anthropic Claude model support to @forestadmin/ai-proxy using @langchain/anthropic
- Providers are configured with name, provider, model, and apiKey

Changes

- Add @langchain/anthropic dependency
- Add AnthropicConfiguration type with all Anthropic-specific options
- Add ANTHROPIC_MODELS constant with supported Claude models
- Add AnthropicUnprocessableError for error handling

Test plan
🤖 Generated with Claude Code