
Update OpenAI to 2.9.0 #7349

Open
stephentoub wants to merge 2 commits into dotnet:main from stephentoub:openai290

Conversation

@stephentoub
Member

@stephentoub stephentoub commented Feb 28, 2026


@stephentoub stephentoub requested review from a team as code owners February 28, 2026 03:22
Copilot AI review requested due to automatic review settings February 28, 2026 03:22
@github-actions github-actions bot added the area-ai Microsoft.Extensions.AI libraries label Feb 28, 2026
Contributor

Copilot AI left a comment


Pull request overview

Updates the repo’s OpenAI .NET SDK dependency to 2.9.0 and adapts the OpenAI integration layer (Responses + Realtime + image generation) and tests to the updated SDK surface area.

Changes:

  • Bump OpenAI package version to 2.9.0 and update affected call patterns (notably Responses client model selection and image input handling).
  • Update Responses/Realtime conversion & tooling glue code to match the new SDK types/properties.
  • Adjust tests to new SDK behaviors and payload shapes (e.g., image URL media type inference, instructions representation).

Reviewed changes

Copilot reviewed 12 out of 12 changed files in this pull request and generated 5 comments.

File Description
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientTests.cs Updates Responses client creation/adapter usage and expected media type for image URL content.
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientIntegrationTests.cs Updates integration test client creation to new Responses client + model wiring.
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIConversionTests.cs Updates conversion tests for new tool types/properties and instructions representation changes.
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs Updates expected request JSON for model/token fields in chat tests.
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs Refactors to accommodate SDK changes (model handling, image URI handling, reasoning effort enum changes).
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIRealtimeConversationClient.cs Renames/updates realtime function tool mapping to new SDK type/property names.
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIImageGenerator.cs Adds media-type-to-output-format mapping for image edit options and minor boolean pattern tweaks.
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIClientExtensions.cs Updates ResponsesClient AsIChatClient signature to accept a default model id.
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIChatClient.cs Updates reasoning effort mapping and notes a 2.9.0 regression around model override patching.
src/Libraries/Microsoft.Extensions.AI.OpenAI/MicrosoftExtensionsAIResponsesExtensions.cs Adjusts Responses result instructions mapping and updates doc cref signatures.
src/Libraries/Microsoft.Extensions.AI.OpenAI/MicrosoftExtensionsAIRealtimeExtensions.cs Renames realtime extension method to new tool type and updates docs accordingly.
eng/packages/General.props Bumps OpenAI package version from 2.8.0 to 2.9.0.
Comments suppressed due to low confidence (3)

src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIClientExtensions.cs:127

  • defaultModelId is optional, but when omitted it can cause requests to be sent without a model (depending on ChatOptions.ModelId), which is likely to fail at runtime and is a behavior change from the previous implementation that used the response client's model. If possible, consider requiring defaultModelId (or throwing when missing) to avoid silent misconfiguration.
    /// <summary>Gets an <see cref="IChatClient"/> for use with this <see cref="ResponsesClient"/>.</summary>
    /// <param name="responseClient">The client.</param>
    /// <param name="defaultModelId">The default model ID to use for the chat client.</param>
    /// <returns>An <see cref="IChatClient"/> that can be used to converse via the <see cref="ResponsesClient"/>.</returns>
    /// <exception cref="ArgumentNullException"><paramref name="responseClient"/> is <see langword="null"/>.</exception>
    [Experimental(DiagnosticIds.Experiments.AIOpenAIResponses)]
    public static IChatClient AsIChatClient(this ResponsesClient responseClient, string? defaultModelId = null) =>
        new OpenAIResponsesChatClient(responseClient, defaultModelId);
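One way to realize the reviewer's fail-fast suggestion is to resolve the effective model at call time and throw instead of emitting a model-less request. This is a hedged sketch only; the helper name, placement, and message are illustrative and not the repo's actual API:

```csharp
// Illustrative sketch: prefer the per-request ChatOptions.ModelId over the
// client-wide default, and fail fast rather than silently sending a request
// without a model.
internal static string ResolveModelId(string? optionsModelId, string? defaultModelId) =>
    optionsModelId ?? defaultModelId ?? throw new InvalidOperationException(
        "No model id available. Set ChatOptions.ModelId or pass a defaultModelId to AsIChatClient.");
```

A guard like this would surface the misconfiguration at the first call rather than as a server-side failure.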

test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs:1684

  • These tests are named as if ChatOptions.ModelId overrides the client model, but the expected request JSON now uses the client model (gpt-4o-mini) while ChatOptions.ModelId is set to gpt-4o. Either update the test name/assertions to reflect the new behavior, or (preferably) keep the test asserting the override behavior once the underlying issue is fixed.

This issue also appears on line 1692 of the same file.

    [Fact]
    public async Task ChatOptions_ModelId_OverridesClientModel_NonStreaming()
    {
        const string Input = """
            {
                "temperature":0.5,
                "messages":[{"role":"user","content":"hello"}],
                "model":"gpt-4o-mini",
                "max_completion_tokens":10
            }
            """;

        const string Output = """
            {
              "id": "chatcmpl-ADx3PvAnCwJg0woha4pYsBTi3ZpOI",
              "object": "chat.completion",
              "created": 1727888631,
              "model": "gpt-4o-2024-08-06",
              "choices": [
                {
                  "index": 0,
                  "message": {
                    "role": "assistant",
                    "content": "Hello! How can I assist you today?",
                    "refusal": null
                  },
                  "logprobs": null,
                  "finish_reason": "stop"
                }
              ],
              "usage": {
                "prompt_tokens": 8,
                "completion_tokens": 9,
                "total_tokens": 17
              }
            }
            """;

        using VerbatimHttpHandler handler = new(Input, Output);
        using HttpClient httpClient = new(handler);
        using IChatClient client = CreateChatClient(httpClient, "gpt-4o-mini");

        var response = await client.GetResponseAsync("hello", new()
        {
            MaxOutputTokens = 10,
            Temperature = 0.5f,
            ModelId = "gpt-4o",
        });

test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIChatClientTests.cs:1731

  • Same issue as the non-streaming variant: the test name indicates ChatOptions.ModelId overrides the client model, but the expected request JSON uses gpt-4o-mini while ChatOptions.ModelId is set to gpt-4o. Please align the test name/assertions with the intended contract.
    [Fact]
    public async Task ChatOptions_ModelId_OverridesClientModel_Streaming()
    {
        const string Input = """
            {
                "temperature":0.5,
                "messages":[{"role":"user","content":"hello"}],
                "model":"gpt-4o-mini",
                "max_completion_tokens":20,
                "stream":true,
                "stream_options":{"include_usage":true}
            }
            """;

        const string Output = """
            data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"role":"assistant","content":"","refusal":null},"logprobs":null,"finish_reason":null}],"usage":null}

            data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"content":"Hello"},"logprobs":null,"finish_reason":null}],"usage":null}

            data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{"content":"!"},"logprobs":null,"finish_reason":null}],"usage":null}

            data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[{"index":0,"delta":{},"logprobs":null,"finish_reason":"stop"}],"usage":null}

            data: {"id":"chatcmpl-ADxFKtX6xIwdWRN42QvBj2u1RZpCK","object":"chat.completion.chunk","created":1727889370,"model":"gpt-4o-2024-08-06","system_fingerprint":"fp_f85bea6784","choices":[],"usage":{"prompt_tokens":8,"completion_tokens":9,"total_tokens":17}}

            data: [DONE]

            """;

        using VerbatimHttpHandler handler = new(Input, Output);
        using HttpClient httpClient = new(handler);
        using IChatClient client = CreateChatClient(httpClient, "gpt-4o-mini");

        List<ChatResponseUpdate> updates = [];
        await foreach (var update in client.GetStreamingResponseAsync("hello", new()
        {
            MaxOutputTokens = 20,
            Temperature = 0.5f,
            ModelId = "gpt-4o",
        }))
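If the intended contract is that the per-call ChatOptions.ModelId takes precedence, the expected request JSON in these tests would carry the override rather than the client model. A sketch of the streaming variant's input under that assumption (illustrative only; this is not what the test currently asserts):

```csharp
// Hypothetical expected request if the per-call ModelId override were honored:
// "model" would be the ChatOptions value ("gpt-4o"), not the client default.
const string ExpectedWithOverride = """
    {
        "temperature":0.5,
        "messages":[{"role":"user","content":"hello"}],
        "model":"gpt-4o",
        "max_completion_tokens":20,
        "stream":true,
        "stream_options":{"include_usage":true}
    }
    """;
```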

Member

@tarekgh tarekgh left a comment


Left a minor question, LGTM!

@tarekgh
Member

tarekgh commented Mar 1, 2026

@stephentoub looks like we need to get the new version of the package added to our infra internal NuGet cache. I used to ask the infra first responders to help with that, but I don't know if we still do that.

src/Libraries/Microsoft.Extensions.AI.OpenAI/Microsoft.Extensions.AI.OpenAI.csproj(0,0): error NU1102: (NETCORE_ENGINEERING_TELEMETRY=Restore) Unable to find package OpenAI with version (>= 2.9.0)

@tarekgh
Member

tarekgh commented Mar 1, 2026

> @stephentoub looks like we need to get the new version of the package added to our infra internal NuGet cache. I used to ask the infra first responders to help with that, but I don't know if we still do that.
>
> src/Libraries/Microsoft.Extensions.AI.OpenAI/Microsoft.Extensions.AI.OpenAI.csproj(0,0): error NU1102: (NETCORE_ENGINEERING_TELEMETRY=Restore) Unable to find package OpenAI with version (>= 2.9.0)

I see you already did 😄

@stephentoub stephentoub closed this Mar 1, 2026
@stephentoub stephentoub reopened this Mar 1, 2026
@stephentoub stephentoub enabled auto-merge (squash) March 1, 2026 19:31
@stephentoub stephentoub requested a review from a team as a code owner March 1, 2026 22:06
@stephentoub
Member Author

Blocked on openai/openai-dotnet#991

Labels

area-ai Microsoft.Extensions.AI libraries

3 participants