fix: skip temperature param for OpenAI reasoning/newer models in summary tool#252

Open
octo-patch wants to merge 1 commit into Alibaba-NLP:main from octo-patch:fix/issue-211-temperature-unsupported-new-models
Conversation

@octo-patch
Fixes #211

Problem

When a newer OpenAI model (the o1/o3/o4 reasoning families, gpt-5, etc.) is used as the summary model (SUMMARY_MODEL_NAME), the call_server method in both inference/tool_visit.py and WebAgent/WebResummer/src/tool_visit.py unconditionally passes temperature=0.7. These model families do not accept the temperature parameter, so the API call fails with an error instead of producing a summary.

Solution

Before building the API call kwargs, check whether the model name starts with a prefix known not to support temperature (o1, o3, o4, gpt-5). Include temperature in the request only when the model supports it; otherwise omit it and let the API use its default.

_NO_TEMPERATURE_PREFIXES = ("o1", "o3", "o4", "gpt-5")
create_kwargs = dict(model=model_name, messages=msgs)
if not any(model_name.startswith(p) for p in _NO_TEMPERATURE_PREFIXES):
    create_kwargs["temperature"] = 0.7
chat_response = client.chat.completions.create(**create_kwargs)
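Since the same check is needed in two files, it could be factored into a small helper so both call sites stay in sync. A minimal sketch (the helper names supports_temperature and build_create_kwargs are hypothetical, not part of the diff):

```python
# Prefixes of OpenAI model families that reject an explicit temperature.
_NO_TEMPERATURE_PREFIXES = ("o1", "o3", "o4", "gpt-5")

def supports_temperature(model_name: str) -> bool:
    """Return True if the model accepts an explicit temperature parameter."""
    return not any(model_name.startswith(p) for p in _NO_TEMPERATURE_PREFIXES)

def build_create_kwargs(model_name: str, msgs: list, temperature: float = 0.7) -> dict:
    """Build chat.completions.create kwargs, omitting temperature when unsupported."""
    kwargs = {"model": model_name, "messages": msgs}
    if supports_temperature(model_name):
        kwargs["temperature"] = temperature
    return kwargs
```

Both call sites would then reduce to `client.chat.completions.create(**build_create_kwargs(model_name, msgs))`.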

Files changed

  • inference/tool_visit.py (Visit.call_server())
  • WebAgent/WebResummer/src/tool_visit.py (Visit.call_server())

Testing

Tested locally by setting SUMMARY_MODEL_NAME=o1-mini and verifying that no temperature parameter is sent, and with SUMMARY_MODEL_NAME=gpt-4o to confirm that temperature=0.7 is still included for standard models.


Development

Successfully merging this pull request may close these issues.

OPENAI GPT Summary does not support Temperature for new models like Gpt-5