Merged
@@ -38,3 +38,5 @@ Optional args:
 
 [CONVERSATION]
 {{ conversation }}
+
+Determine which function to call to proceed. Your output must be in JSON format.
Action required

1. Tool call suppressed by JSON 🐞 Bug ✓ Correctness

The routing agent instruction now mandates JSON output, which can cause tool-capable providers to return a normal assistant message instead of a native tool/function call. When the routing agent is invoked via `InvokeAgent`, non-function responses are overwritten with a generic apology, breaking routing.
Agent Prompt
### Issue description
The router instruction now forces “output must be in JSON”, which can suppress native tool calls on providers like OpenAI. When a routing agent is invoked via `RoutingService.InvokeAgent`, any non-tool-call output is replaced with a generic apology, so routing fails.

### Issue Context
OpenAI provider supplies agent functions as `options.Tools`, and only returns `AgentRole.Function` when the model emits a tool call. `InvokeAgent` requires `AgentRole.Function` for routing agents; otherwise it overwrites with an apology.
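To make the failure concrete, here is a simplified sketch of the gate described above. The type and member names (`AgentType`, `AgentRole`, `response.Role`, and the apology text) are illustrative approximations of BotSharp conventions, not the actual `InvokeAgent` source:

```csharp
// Illustrative only: approximates the routing gate in
// RoutingService.InvokeAgent. When the model obeys the new "output
// must be in JSON" directive, it emits a plain assistant message, so
// response.Role is not AgentRole.Function and the routing payload in
// response.Content is discarded.
if (agent.Type == AgentType.Routing && response.Role != AgentRole.Function)
{
    // The JSON routing instruction the model produced is lost here.
    response.Content = "Sorry, I cannot help with that right now."; // generic apology
}
```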

### Fix Focus Areas
- src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid[39-42]
- src/Infrastructure/BotSharp.Core/Routing/RoutingService.InvokeAgent.cs[50-81]
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[369-406]

### Suggested changes
1. Update the instruction to prefer tool calling explicitly (e.g., “Call one of the available functions; do not respond with a JSON message body”), OR scope the JSON requirement to reasoner templates instead of the global router instruction.
2. Optionally add a fallback in `InvokeAgent` for routing agents: if `response.Role != Function` but `response.Content` contains a valid `FunctionCallFromLlm` JSON, parse it and execute the corresponding function rather than overwriting with apology.
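Suggestion 2 could look roughly like the following. This is a hedged sketch: it assumes a `JsonContent<T>` extension method and a `Function` property on `FunctionCallFromLlm`, mirroring how they are referenced elsewhere in this review; the exact member names and signatures may differ:

```csharp
// Fallback for routing agents: before overwriting a non-tool-call
// response with an apology, try to recover a routing instruction
// from the JSON message body the model returned.
if (response.Role != AgentRole.Function)
{
    var inst = response.Content.JsonContent<FunctionCallFromLlm>();
    if (inst != null && !string.IsNullOrWhiteSpace(inst.Function))
    {
        response.Role = AgentRole.Function;
        response.FunctionName = inst.Function;
        response.FunctionArgs = JsonSerializer.Serialize(inst);
        // Proceed to execute the recovered function instead of apologizing.
    }
}
```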

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools

Action required

2. Tool args ignored by reasoner 🐞 Bug ⛯ Reliability

Even with the new JSON requirement, tool-capable providers may still return tool calls with arguments in `FunctionArgs` and empty `Content`. `OneStepForwardReasoner` parses only `response.Content`, so it can deserialize `{}` and produce a routing instruction that is missing required routing data, breaking routing execution.
Agent Prompt
### Issue description
`OneStepForwardReasoner` (and similarly `HFReasoner`) parses only `response.Content` as JSON. With tool-calling models/providers, routing instructions may be returned as a tool call where the JSON is in `response.FunctionArgs` and `response.Content` is empty, causing deserialization of `{}` and malformed routing instructions.

### Issue Context
OpenAI provider returns `AgentRole.Function` with `FunctionArgs` populated on tool calls. `NaiveReasoner` already handles this by parsing `(response.FunctionArgs ?? response.Content)`.

### Fix Focus Areas
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/OneStepForwardReasoner.cs[64-71]
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/HFReasoner.cs[53-60]
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/NaiveReasoner.cs[57-62]

### Suggested changes
1. Change `OneStepForwardReasoner` to parse `(response.FunctionArgs ?? response.Content).JsonContent<FunctionCallFromLlm>()`.
2. Apply the same change to HFReasoner.
3. After code is robust, consider removing/scoping the new “output must be in JSON” directive from the global router instruction to avoid suppressing tool calls.
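Applied to `OneStepForwardReasoner`, changes 1 and 2 reduce to preferring the tool-call payload when present, which is the same pattern the review says `NaiveReasoner` already uses (sketch only; surrounding parsing code elided):

```csharp
// Prefer the tool-call arguments; fall back to the message body for
// providers that return the routing JSON as plain assistant content.
var inst = (response.FunctionArgs ?? response.Content)
    .JsonContent<FunctionCallFromLlm>();
```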

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools
