fix: Wrap non-JSON tool responses for Gemini compatibility #1392

azanux wants to merge 2 commits into embabel:main from …son-response-gemini

Conversation

Signed-off-by: Charles <azanux@gmail.com>
@azanux — thanks for identifying this and the original fix! I took a slightly different approach: I introduced a pluggable ToolResponseContentAdapter strategy and wired it via Gemini auto-configuration, so the JSON wrapping is localized to Gemini only. Other providers keep receiving plain text unchanged.
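As a rough illustration of that strategy (only the ToolResponseContentAdapter name comes from the comment above; the method name, the PASSTHROUGH constant, and the wrapping logic below are assumptions, not the actual implementation):

```java
// Hypothetical sketch of the pluggable adapter strategy.
// Everything except the interface name is illustrative.
interface ToolResponseContentAdapter {
    String adapt(String textContent);
}

class Adapters {
    // Default for non-Gemini providers: tool output is sent unchanged.
    static final ToolResponseContentAdapter PASSTHROUGH = text -> text;

    // Gemini-only: wrap plain text so the response is a valid JSON object.
    static final ToolResponseContentAdapter GEMINI_JSON_WRAPPING = text -> {
        String trimmed = text.trim();
        if (trimmed.startsWith("{") && trimmed.endsWith("}")) {
            return text; // looks like a JSON object already; pass through
        }
        // Minimal escaping for illustration; a real implementation would
        // serialize with a JSON library such as Jackson.
        String escaped = text.replace("\\", "\\\\").replace("\"", "\\\"");
        return "{\"result\":\"" + escaped + "\"}";
    };
}
```

Registering the Gemini adapter only in Gemini auto-configuration keeps the wrapping invisible to every other provider.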
Thanks for the fix. I'll test all of this by this weekend and confirm if all goes well.
The ToolResponseContentAdapter strategy is really elegant — localizing the JSON wrapping to Gemini is much cleaner. One thing though: in ChatClientLlmOperations.createMessageSender() line 183, when llmRequestEvent != null (observability enabled), the SpringAiLlmMessageSender is created without the adapter, so it falls back to PASSTHROUGH instead of the Gemini JSON wrapper, giving us silent failures and hallucination. Thanks again for the fix!
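To make the reported gap concrete, here is a minimal sketch of the buggy shape versus the fix — the adapter must be threaded through both branches. All class and method names below are simplified stand-ins, not the project's actual code:

```java
// Simplified model of the issue: a factory with two branches, where the
// observability branch forgets to pass the adapter along.
interface ContentAdapter {
    String adapt(String text);
}

class LlmMessageSender {
    static final ContentAdapter PASSTHROUGH = text -> text;
    private final ContentAdapter adapter;

    LlmMessageSender() { this(PASSTHROUGH); } // no-adapter constructor
    LlmMessageSender(ContentAdapter adapter) { this.adapter = adapter; }

    String prepareToolResponse(String text) { return adapter.adapt(text); }
}

class MessageSenderFactory {
    // Buggy: when observability is enabled, the adapter is dropped and the
    // sender silently falls back to PASSTHROUGH.
    static LlmMessageSender createBuggy(boolean observabilityEnabled, ContentAdapter adapter) {
        return observabilityEnabled ? new LlmMessageSender() : new LlmMessageSender(adapter);
    }

    // Fixed: the adapter is passed regardless of the observability branch.
    static LlmMessageSender createFixed(boolean observabilityEnabled, ContentAdapter adapter) {
        return new LlmMessageSender(adapter);
    }
}
```

The silent part is what makes this dangerous: nothing throws, Gemini just receives non-JSON tool output again on the observability path.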
#1391
Summary
Google GenAI (Gemini) requires tool responses (FunctionResponse.response) to be valid JSON objects. When a tool returns a plain text string, Spring AI's GoogleGenAiChatModel.parseJsonToMap() fails to parse it into the required JSON object.
The same tools work correctly with OpenAI, which accepts plain text in tool response content fields.
Root cause
All tool results flow through messageConverters.kt before being sent to the LLM. The textContent was passed as-is to ToolResponseMessage.ToolResponse.responseData without ensuring it's valid JSON.
Most tools worked "by luck" because MethodTool.convertResult() serializes non-String return types to JSON via Jackson; only tools returning a raw String (or implementing Tool directly, like MatryoshkaTool) were affected.
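A sketch of why structured returns were safe while raw Strings were not. This mimics the described MethodTool.convertResult() behavior; the hand-rolled map serialization below is only a stand-in for Jackson, and the class name is hypothetical:

```java
import java.util.Map;

// Illustrative stand-in for MethodTool.convertResult(): structured results
// get serialized to JSON, but raw Strings pass through untouched and
// reach Gemini as invalid JSON.
class ResultConverter {
    static String convertResult(Object result) {
        if (result instanceof String s) {
            return s; // raw text: NOT wrapped — the problematic path
        }
        if (result instanceof Map<?, ?> map) {
            // Simplified serialization standing in for Jackson.
            StringBuilder sb = new StringBuilder("{");
            boolean first = true;
            for (Map.Entry<?, ?> e : map.entrySet()) {
                if (!first) sb.append(',');
                sb.append('"').append(e.getKey()).append("\":\"")
                  .append(e.getValue()).append('"');
                first = false;
            }
            return sb.append('}').toString();
        }
        return String.valueOf(result);
    }
}
```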
Fix
Test plan