
fix: support http_client factory in OpenAIModel client_args#1893

Open
giulio-leone wants to merge 1 commit into strands-agents:main from giulio-leone:fix/openai-http-client-factory

Conversation

@giulio-leone (Contributor)

Problem

When http_client is passed as an httpx.AsyncClient instance in client_args, it gets closed after the first async with block. Subsequent requests fail because the same closed client instance is reused.

Users cannot pass a custom HTTP client to the OpenAI provider without it breaking on the second invocation.
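The failure mode can be reproduced without the provider itself. A minimal sketch, using a stub stand-in for httpx.AsyncClient so no network access or httpx install is needed:

```python
import asyncio


class StubAsyncClient:
    """Minimal stand-in for httpx.AsyncClient, just enough to show the bug."""

    def __init__(self):
        self.closed = False

    async def __aenter__(self):
        return self

    async def __aexit__(self, *exc):
        # Exiting the context manager closes the client for good.
        self.closed = True

    async def send(self, request):
        if self.closed:
            raise RuntimeError("Cannot send a request, as the client has been closed.")
        return "ok"


async def main():
    client = StubAsyncClient()
    async with client:
        await client.send("request 1")  # first request succeeds
    try:
        await client.send("request 2")  # same instance reused after close
    except RuntimeError as exc:
        print(f"second request fails: {exc}")


asyncio.run(main())
```

A real httpx.AsyncClient behaves the same way: once closed, any further send raises a RuntimeError rather than reopening the connection pool.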

Closes #1036

Solution

Support http_client as either:

  • Instance (existing behavior): passed through unchanged — the httpx.AsyncClient is used directly
  • Callable factory (new): invoked on every request to produce a fresh client

The distinction uses hasattr(http_client, 'send') — real HTTP clients have a .send() method, while factory functions/lambdas do not.
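In pure Python, that resolution step can be sketched as follows. The helper name resolve_http_client is illustrative; the PR performs this check inside _get_client():

```python
def resolve_http_client(http_client):
    """Return a ready-to-use client from either an instance or a factory.

    Hypothetical helper mirroring the check described above: real HTTP
    clients expose a .send() method, while factory callables do not.
    """
    if http_client is None:
        return None
    if hasattr(http_client, "send"):
        return http_client  # instance: pass through unchanged
    return http_client()  # factory: build a fresh client for this request


class FakeClient:
    def send(self, request):  # the attribute that marks a "real" client
        return "response"


instance = FakeClient()
assert resolve_http_client(instance) is instance  # instance passthrough
assert isinstance(resolve_http_client(lambda: FakeClient()), FakeClient)  # factory
```

Note that this heuristic treats any object with a .send attribute as an instance, so a factory must be a function or lambda; passing the client class itself would be treated as an instance, since the class also exposes .send.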

Usage

import httpx

from strands.models.openai import OpenAIModel

# Factory pattern — fresh client on every request (recommended)
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": "your-key",
        "http_client": lambda: httpx.AsyncClient(timeout=30),
    },
)

# Instance pattern — still works, caller manages lifecycle
model = OpenAIModel(
    model_id="gpt-4o",
    client_args={
        "api_key": "your-key",
        "http_client": my_custom_client,
    },
)

Changes

  • src/strands/models/openai.py: Updated _get_client() to resolve factory callables; updated docstring
  • tests/strands/models/test_openai.py: 3 new tests (factory invocation, instance passthrough, no mutation)

Testing

  • All 1869 tests pass
  • New tests verify factory is called on each request, instances pass through unchanged, and original client_args dict is not mutated
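Sketched in miniature, the behaviors those tests pin down look roughly like this (names are illustrative, not the actual test code):

```python
def resolve(http_client):
    # Mirrors the hasattr(http_client, "send") check described in the PR.
    return http_client if hasattr(http_client, "send") else http_client()


class FakeClient:
    def send(self, request):
        return "response"


calls = []


def factory():
    calls.append(1)
    return FakeClient()


client_args = {"api_key": "key", "http_client": factory}
snapshot = dict(client_args)

# Simulate two requests: each one resolves a fresh client.
for _ in range(2):
    client = resolve(client_args["http_client"])
    assert isinstance(client, FakeClient)

assert len(calls) == 2          # factory invoked once per request
assert client_args == snapshot  # original client_args dict not mutated
```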

When http_client is passed as an httpx.AsyncClient instance in
client_args, it gets closed after the first async with block, causing
subsequent requests to fail with a closed client error.

This adds support for passing http_client as a callable factory (e.g.,
lambda: httpx.AsyncClient(...)) that produces a fresh client on each
request. Instance clients (with a .send method) are still passed through
unchanged for backwards compatibility.

Closes strands-agents#1036
@giulio-leone force-pushed the fix/openai-http-client-factory branch from cc81b65 to ccf0819 on March 15, 2026 at 16:12
@giulio-leone (Contributor, Author)

Friendly ping — adds support for http_client factory functions in OpenAIModel client_args, enabling custom transport/proxy configuration.


Successfully merging this pull request may close these issues.

[BUG] OpenAI Implementation does not support passing custom HTTP client
