
Enhance Prompt logic for Model update to 5-series #1309

Merged
iceljc merged 1 commit into SciSharp:master from jlilessen:patch-2 on Mar 16, 2026
Conversation

@jlilessen
Contributor

Added a small piece to enhance the prompt logic for stable tool-calling results when shifting to the 5-series model; also verified to have no influence on previous 4-series model usage.

@qodo-code-review
Contributor

Review Summary by Qodo

Add JSON format instruction to agent prompt template

✨ Enhancement


Walkthrough

Description
• Add explicit JSON format instruction to prompt template
• Clarify function calling output requirement for model
• Improve prompt consistency for 5-series model compatibility
Diagram

```mermaid
flowchart LR
  A["Agent Instruction Template"] -- "Add JSON format directive" --> B["Enhanced Prompt Logic"]
  B -- "Improves 5-series model compatibility" --> C["Stable Function Calling Results"]
```


File Changes

1. src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid ✨ Enhancement +2/-0

Add JSON format requirement to prompt

• Added explicit instruction requiring JSON format output
• Clarifies function selection requirement for model
• Enhances prompt logic for improved 5-series model behavior


@qodo-code-review
Contributor

qodo-code-review bot commented Mar 16, 2026

Code Review by Qodo

🐞 Bugs (2) 📘 Rule violations (0) 📎 Requirement gaps (0)



Action required

1. Toolcall suppressed by JSON 🐞 Bug ✓ Correctness
Description
The routing agent instruction now mandates JSON output, which can cause tool-capable providers to
return a normal assistant message instead of a tool/function call. When the routing agent is invoked
via InvokeAgent, non-function responses are overwritten with a generic apology, breaking routing.
Code

src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid [42]

```diff
+Determine which function to call to proceed. Your output must be in JSON format.
```
Evidence
The changed instruction explicitly requires JSON output. For OpenAI, functions referenced by the
instruction are surfaced as Tools; if the model follows the JSON-text requirement instead of
emitting a tool call, the response role will be Assistant (not Function). For routing agents,
InvokeAgent replaces such outputs with a generic apology, so routing instructions are lost.

src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid[39-42]
src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[369-406]
src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[57-102]
src/Infrastructure/BotSharp.Core/Routing/RoutingService.InvokeAgent.cs[50-81]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution.

### Issue description
The router instruction now forces “output must be in JSON”, which can suppress native tool calls on providers like OpenAI. When a routing agent is invoked via `RoutingService.InvokeAgent`, any non-tool-call output is replaced with a generic apology, so routing fails.

### Issue Context
OpenAI provider supplies agent functions as `options.Tools`, and only returns `AgentRole.Function` when the model emits a tool call. `InvokeAgent` requires `AgentRole.Function` for routing agents; otherwise it overwrites with an apology.

### Fix Focus Areas
- src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid[39-42]
- src/Infrastructure/BotSharp.Core/Routing/RoutingService.InvokeAgent.cs[50-81]
- src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[369-406]

### Suggested changes
1. Update the instruction to prefer tool calling explicitly (e.g., “Call one of the available functions; do not respond with a JSON message body”), OR scope the JSON requirement to reasoner templates instead of the global router instruction.
2. Optionally add a fallback in `InvokeAgent` for routing agents: if `response.Role != Function` but `response.Content` contains a valid `FunctionCallFromLlm` JSON, parse it and execute the corresponding function rather than overwriting with apology.
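The fallback in suggestion 2 can be sketched in a language-agnostic way. The actual code is C# (`RoutingService.InvokeAgent`); below is a minimal Python sketch, where the function name and shapes are hypothetical illustrations, not the BotSharp API. The idea: before replacing a non-tool-call response with an apology, try to interpret the message body as a serialized function call.

```python
import json

def resolve_routing_call(role, content, function_args):
    """Hypothetical sketch of the suggested InvokeAgent fallback: accept a
    routing call either from the provider's native function-call role or,
    failing that, from a JSON message body the model emitted as text."""
    if role == "function":
        # Native tool call: arguments usually live in function_args.
        return json.loads(function_args if function_args is not None else content)
    # Fallback: the model obeyed the "output must be in JSON" directive
    # instead of emitting a native tool call; try to parse the body.
    try:
        call = json.loads(content)
    except (json.JSONDecodeError, TypeError):
        return None  # genuinely not a routing call; caller may apologize
    if isinstance(call, dict) and "function" in call:
        return call
    return None
```

With this shape, a JSON message body that encodes a function call is still routable instead of being overwritten with the generic apology.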

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools


2. Tool args ignored by reasoner 🐞 Bug ⛯ Reliability
Description
Even with the new JSON requirement, tool-capable providers may still return tool calls with
arguments in FunctionArgs and empty Content. OneStepForwardReasoner parses only
response.Content, so it can deserialize {} and produce a routing instruction missing required
routing data, breaking routing execution.
Code

src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid [42]

```diff
+Determine which function to call to proceed. Your output must be in JSON format.
```
Evidence
The PR attempts to enforce JSON output, but OpenAI can still emit a tool call (Role=Function) where
arguments are in FunctionArgs, while Content is often empty. OneStepForwardReasoner ignores
FunctionArgs and always parses Content, unlike NaiveReasoner which correctly falls back to
FunctionArgs first; this makes routing brittle under tool-calling models (e.g., GPT-5).

src/Infrastructure/BotSharp.Core/data/agents/01fcc3e5-9af7-49e6-ad7a-a760bd12dc4a/instructions/instruction.liquid[39-42]
src/Plugins/BotSharp.Plugin.OpenAI/Providers/Chat/ChatCompletionProvider.cs[57-75]
src/Infrastructure/BotSharp.Core/Routing/Reasoning/OneStepForwardReasoner.cs[64-71]
src/Infrastructure/BotSharp.Core/Routing/Reasoning/NaiveReasoner.cs[57-62]

Agent prompt
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution.

### Issue description
`OneStepForwardReasoner` (and similarly `HFReasoner`) parses only `response.Content` as JSON. With tool-calling models/providers, routing instructions may be returned as a tool call where the JSON is in `response.FunctionArgs` and `response.Content` is empty, causing deserialization of `{}` and malformed routing instructions.

### Issue Context
OpenAI provider returns `AgentRole.Function` with `FunctionArgs` populated on tool calls. `NaiveReasoner` already handles this by parsing `(response.FunctionArgs ?? response.Content)`.

### Fix Focus Areas
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/OneStepForwardReasoner.cs[64-71]
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/HFReasoner.cs[53-60]
- src/Infrastructure/BotSharp.Core/Routing/Reasoning/NaiveReasoner.cs[57-62]

### Suggested changes
1. Change OneStepForwardReasoner to parse `(response.FunctionArgs ?? response.Content).JsonContent<FunctionCallFromLlm>()`.
2. Apply the same change to HFReasoner.
3. After code is robust, consider removing/scoping the new “output must be in JSON” directive from the global router instruction to avoid suppressing tool calls.
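The core of suggestion 1 is the null-coalescing fallback `(response.FunctionArgs ?? response.Content)` that `NaiveReasoner` already uses. The actual fix is C#; here is a minimal Python sketch of the same ordering, with hypothetical field names mirroring the review's description.

```python
import json

def parse_function_call(function_args, content):
    """Sketch of the suggested OneStepForwardReasoner fix: prefer the
    tool-call arguments (FunctionArgs) over the often-empty message body
    (Content), mirroring C#'s null-coalescing ?? operator, which falls
    through only when the left operand is null."""
    payload = function_args if function_args is not None else content
    return json.loads(payload or "{}")
```

Parsing only `content` would deserialize `{}` whenever a tool-calling model puts the routing JSON in the arguments; preferring `function_args` recovers it.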

ⓘ Copy this prompt and use it to remediate the issue with your preferred AI generation tools



@iceljc iceljc merged commit f02ffef into SciSharp:master Mar 16, 2026
3 of 4 checks passed
The changed template context (instruction.liquid):

```liquid
[CONVERSATION]
{{ conversation }}

Determine which function to call to proceed. Your output must be in JSON format.
```