TanStack AI version
v0.6.1
Framework/Library version
N/A
Describe the bug and the steps to reproduce it
When using the openRouterText adapter with any model, the stream terminates immediately upon receiving a TOOL_END event, but before the tool is actually called or executed.
const stream = chat({
  adapter: openRouterText("minimax/minimax-m2.5"),
  messages,
  conversationId,
  tools: [serverTool],
});
Steps to reproduce:
- Send a prompt that triggers serverTool.
- Monitor the SSE stream.
- Observe TOOL_START followed immediately by TOOL_END.
- Result: The serverTool function is never invoked, and the connection closes.
- Send a second message (e.g., "Why didn't you run the tool?").
- The model acknowledges the previous state and executes the tool.
NOTE:
I fixed this in a fork after debugging the stream handling logic. The issue is that the for (const choice of chunk.choices) loop skips the final message chunk entirely when its choices array is empty, so the end-of-stream handling that triggers tool execution never runs.
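To illustrate the failure mode described above, here is a minimal sketch (not the actual TanStack AI source; the chunk shape and function names are simplified assumptions based on the OpenAI-compatible streaming format OpenRouter emits). The buggy loop never sees a final chunk whose choices array is empty, so any end-of-stream work inside it is skipped; the fix handles the empty-choices chunk explicitly before the loop.

```typescript
// Hypothetical, simplified chunk shape (assumption, not the library's types).
interface StreamChunk {
  choices: Array<{ delta?: { content?: string }; finish_reason?: string | null }>;
  usage?: { total_tokens: number };
}

// Buggy shape: a final chunk with an empty `choices` array never enters the
// inner loop, so the stream-end path (which would trigger tool execution) is skipped.
function processChunksBuggy(chunks: StreamChunk[]): string[] {
  const events: string[] = [];
  for (const chunk of chunks) {
    for (const choice of chunk.choices) {
      if (choice.delta?.content) events.push(`delta:${choice.delta.content}`);
      if (choice.finish_reason) events.push(`finish:${choice.finish_reason}`);
    }
  }
  return events;
}

// Fixed shape: detect the empty-choices final chunk before iterating, so the
// stream-end handling still fires even when no choice objects are present.
function processChunksFixed(chunks: StreamChunk[]): string[] {
  const events: string[] = [];
  for (const chunk of chunks) {
    if (chunk.choices.length === 0) {
      events.push("stream-end"); // e.g. flush buffered tool calls here
      continue;
    }
    for (const choice of chunk.choices) {
      if (choice.delta?.content) events.push(`delta:${choice.delta.content}`);
      if (choice.finish_reason) events.push(`finish:${choice.finish_reason}`);
    }
  }
  return events;
}
```

With a stream whose last chunk carries only usage data (choices: []), the buggy version emits no stream-end event while the fixed version does, matching the observed TOOL_START/TOOL_END-then-disconnect behavior.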
Your Minimal, Reproducible Example - (Sandbox Highly Recommended)
https://github.com/andorep/openrouter-error
Screenshots or Videos (Optional)
No response
Do you intend to try to help solve this bug with your own PR?
Yes, I think I know how to fix it and will discuss it in the comments of this issue
Terms & Code of Conduct