
fix(provider): fix Anthropic custom headers and system prompt compatibility#7587

Open
PinkYuDeer wants to merge 3 commits into AstrBotDevs:master from PinkYuDeer:fix/anthropic-custom-headers-and-system-prompt

Conversation


@PinkYuDeer PinkYuDeer commented Apr 15, 2026

Motivation

When connecting to Anthropic Claude through a third-party API proxy (e.g. a forwarding service behind Cloudflare protection) in the AstrBot desktop edition, two blocking issues prevented normal use.

Problems fixed

  1. Custom request headers caused provider loading to fail

When a user adds custom request headers (such as User-Agent) on the configuration page, the original code created an httpx.AsyncClient instance and passed it to AsyncAnthropic(http_client=...). In some environments (e.g. the AstrBot desktop edition, where a bundled Python coexists with the system Python), sys.path contains multiple httpx installation paths, so the SDK's internal isinstance(http_client, httpx.AsyncClient) check fails with:

Invalid http_client argument; Expected an instance of httpx.AsyncClient but got <class 'httpx.AsyncClient'>

  2. The system parameter format was incompatible with third-party proxies

The original code passed the system parameter as a plain string ("system": "You are..."). Some third-party API proxies strictly require the list format and return a 400 error when they receive a string:

Invalid JSON in request body
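For illustration, the two request-body shapes look like this (model name and message content are placeholders, not from the actual diff):

```python
# The `system` parameter of the Anthropic Messages API, in both shapes.

# String form: valid for the official API, but rejected by some strict proxies.
string_form = {
    "model": "claude-3-5-sonnet-latest",
    "system": "You are a helpful assistant.",
    "messages": [{"role": "user", "content": "Hello"}],
}

# Content-block list form: also valid for the official API, and accepted
# by proxies that validate `system` as a list of typed blocks.
block_form = {
    "model": "claude-3-5-sonnet-latest",
    "system": [{"type": "text", "text": "You are a helpful assistant."}],
    "messages": [{"role": "user", "content": "Hello"}],
}
```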

Modifications

astrbot/core/provider/sources/anthropic_source.py

  • _init_api_key: now passes default_headers=self.custom_headers to AsyncAnthropic, using the SDK's native header-merging mechanism instead of relying on a custom httpx.AsyncClient to carry the headers. This matches the configuration description: "The key-value pairs added here will be merged into the OpenAI SDK's default_headers to customize HTTP request headers."

  • _create_http_client: removed the logic that created an httpx.AsyncClient solely for custom request headers; the method now creates a client only when a proxy is configured.

  • text_chat / text_chat_stream: changed the system parameter from a string to the list format ([{"type": "text", "text": system_prompt}]). The list format is a standard format supported by the official Anthropic API and is also compatible with third-party proxies.

  • This is NOT a breaking change.
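The header-merging behavior the first bullet relies on can be sketched with a hypothetical helper (build_request_headers is illustrative, not the SDK's actual internals): the SDK layers default_headers on top of its own required headers for every request, so no hand-built transport client is needed.

```python
# Hypothetical sketch of how an Anthropic-style SDK merges default_headers
# into each outgoing request. Passing headers here, rather than via a
# hand-built httpx.AsyncClient, sidesteps the isinstance() problem entirely.
def build_request_headers(default_headers, per_request_headers=None):
    # SDK-required headers come first (values are placeholders).
    headers = {
        "x-api-key": "sk-ant-...",
        "anthropic-version": "2023-06-01",
    }
    # User-configured default headers are merged on top...
    headers.update(default_headers or {})
    # ...and per-request headers take final precedence.
    headers.update(per_request_headers or {})
    return headers
```

For example, a configured {"User-Agent": "AstrBot/desktop"} would ride along on every request without touching the transport layer.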

Screenshots or Test Results

(Four screenshots attached in the original PR.)

Checklist

  • 😊 If there are new features added in the PR, I have discussed it with the authors through issues/emails, etc.

  • 👀 My changes have been well-tested, and "Verification Steps" and "Screenshots" have been provided above.

  • 🤓 I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in requirements.txt and pyproject.toml.

  • 😮 My changes do not introduce malicious code.

Summary by Sourcery

Fix Anthropic provider integration to correctly apply custom headers and use a system prompt format compatible with Anthropic and third-party proxies.

Bug Fixes:

  • Ensure custom headers are passed via AsyncAnthropic default headers instead of a custom httpx.AsyncClient to avoid client type errors.
  • Send system prompts as a list of text blocks rather than a plain string to satisfy Anthropic and strict proxy JSON schema requirements.

…bility

- Pass custom_headers via AsyncAnthropic's `default_headers` parameter
  instead of creating a separate httpx.AsyncClient. This avoids
  `isinstance` check failures when multiple httpx installations exist
  on sys.path (e.g. bundled Python + system Python).

- Use list format for the `system` parameter (`[{"type": "text", ...}]`)
  instead of a plain string. The list format is supported by the official
  Anthropic API and is also compatible with third-party API proxies that
  reject the string format.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@auto-assign auto-assign bot requested review from Raven95676 and anka-afk April 15, 2026 15:53
@dosubot dosubot bot added size:XS This PR changes 0-9 lines, ignoring generated files. area:provider The bug / feature is about AI Provider, Models, LLM Agent, LLM Agent Runner. labels Apr 15, 2026

@sourcery-ai sourcery-ai bot left a comment


Hey - I've left some high level feedback:

  • The system_prompt is now always wrapped into a list of {type: text, text: ...} objects; if any existing callers already pass a list or structured system content for Anthropic, this will double-wrap and break them, so it may be safer to detect and pass through list/dict values unchanged.
  • By moving custom_headers from the httpx.AsyncClient into default_headers for AsyncAnthropic, any headers intended specifically for the underlying HTTP client (e.g., proxy authentication or non-API headers) will now be sent as API-level headers; consider clarifying or separating API headers vs transport/proxy headers in the configuration to avoid unintended header placement.
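The pass-through guard suggested in the first point could look like the following sketch (normalize_system_param is an illustrative name, not the actual diff):

```python
# Guard against double-wrapping: only wrap plain strings into the
# content-block list form; structured values pass through unchanged.
def normalize_system_param(system_prompt):
    if isinstance(system_prompt, (list, dict)):
        return system_prompt  # already structured content, leave as-is
    return [{"type": "text", "text": str(system_prompt)}]
```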


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request refactors the Anthropic provider by moving custom header configuration to the client initialization and updating the system prompt format to use content blocks. Review feedback suggests adding type checks for the system prompt in both text_chat and text_chat_stream methods to prevent incorrect wrapping if the prompt is already provided as a list.

Two comment threads on astrbot/core/provider/sources/anthropic_source.py (now marked Outdated)
PinkYuDeer and others added 2 commits April 16, 2026 00:03
…bility

- Pass custom_headers via AsyncAnthropic's `default_headers` parameter
  instead of creating a separate httpx.AsyncClient. This avoids
  `isinstance` check failures when multiple httpx installations exist
  on sys.path (e.g. bundled Python + system Python).

- Use list format for the `system` parameter (`[{"type": "text", ...}]`)
  instead of a plain string. The list format is supported by the official
  Anthropic API and is also compatible with third-party API proxies that
  reject the string format. When system_prompt is already a list, it is
  passed through as-is.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>