
Fixed an issue where AI Reports fail with OpenAI models that do not support the temperature parameter #9725

Open

dpage wants to merge 1 commit into pgadmin-org:master from dpage:fix_model_temp_errors

Conversation

Contributor

@dpage dpage commented Mar 10, 2026

Fixed an issue where AI Reports fail with OpenAI models that do not support the temperature parameter. #9719

Removed the temperature parameter from all LLM provider clients and pipeline calls, allowing each model to use its default. This fixes compatibility with GPT-5-mini/nano and future models that don't support user-configurable temperature.

Summary by CodeRabbit

  • Bug Fixes
    • Fixed AI Reports failing when using OpenAI models that do not support the temperature parameter. The application now removes temperature settings from all AI provider integrations to ensure broader compatibility across different LLM services and models.
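The fix amounts to dropping the key from the request rather than sending some "neutral" value, since models like gpt-5-mini reject any user-supplied temperature. A minimal sketch of the idea, assuming a hypothetical helper (`build_chat_payload` and its parameters are illustrative, not pgAdmin's actual code):

```python
def build_chat_payload(model, messages, temperature=None):
    """Build a chat-completion payload, omitting temperature unless set.

    Hypothetical helper: some models reject a user-supplied temperature,
    so the safest approach is to leave the key out entirely and let each
    model fall back to its own default sampling behaviour.
    """
    payload = {"model": model, "messages": messages}
    if temperature is not None:
        # Only include the key when a caller explicitly asks for it.
        payload["temperature"] = temperature
    return payload
```

The PR goes one step further and removes the parameter from the call chain entirely, so no code path can reintroduce the key by accident.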

Fixed an issue where AI Reports fail with OpenAI models that do not support the temperature parameter. pgadmin-org#9719

Removed the temperature parameter from all LLM provider clients and
pipeline calls, allowing each model to use its default. This fixes
compatibility with GPT-5-mini/nano and future models that don't
support user-configurable temperature.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@coderabbitai

coderabbitai bot commented Mar 10, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 0786083c-7135-41c9-8d14-f5425b13adfa

📥 Commits

Reviewing files that changed from the base of the PR, between a0e6da0 and a898c15.

📒 Files selected for processing (7)
  • docs/en_US/release_notes_9_14.rst
  • web/pgadmin/llm/client.py
  • web/pgadmin/llm/providers/anthropic.py
  • web/pgadmin/llm/providers/docker.py
  • web/pgadmin/llm/providers/ollama.py
  • web/pgadmin/llm/providers/openai.py
  • web/pgadmin/llm/reports/pipeline.py
💤 Files with no reviewable changes (5)
  • web/pgadmin/llm/providers/ollama.py
  • web/pgadmin/llm/providers/docker.py
  • web/pgadmin/llm/providers/anthropic.py
  • web/pgadmin/llm/providers/openai.py
  • web/pgadmin/llm/client.py

Walkthrough

The changes remove the temperature parameter from the LLM client interface, from all provider implementations (OpenAI, Anthropic, Ollama, Docker), and from the report pipeline. A release note documents the fix for AI Reports failing with OpenAI models that don't support the temperature parameter.

Changes

Release Notes
  • docs/en_US/release_notes_9_14.rst — Added a bug fix entry for Issue #9719 regarding AI Reports failing with OpenAI models lacking temperature parameter support.

LLM Client Base
  • web/pgadmin/llm/client.py — Removed the temperature parameter from the LLMClient.chat() method signature and updated its docstring accordingly.

LLM Provider Implementations
  • web/pgadmin/llm/providers/anthropic.py, web/pgadmin/llm/providers/docker.py, web/pgadmin/llm/providers/ollama.py, web/pgadmin/llm/providers/openai.py — Removed the temperature parameter from chat() method signatures and from the corresponding payload construction in all provider clients.

LLM Report Pipeline
  • web/pgadmin/llm/reports/pipeline.py — Removed the temperature argument from LLM API calls in the planning, analysis, synthesis, and generic retry functions.

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~15 minutes

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description check — Passed. Check skipped: CodeRabbit's high-level summary is enabled.
  • Title check — Passed. The PR title accurately summarizes the main change: removing the temperature parameter to fix AI Reports failures with OpenAI models that don't support it.
  • Docstring coverage — Passed. Docstring coverage is 100.00%, which meets the required threshold of 80.00%.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.

