
feat(responses): add cancel() support for streaming responses #2916

Open
giulio-leone wants to merge 6 commits into openai:main from
giulio-leone:feat/streaming-response-cancel

Conversation

@giulio-leone

Summary

Adds the ability to cancel streaming responses mid-stream via a new cancel() method on ResponseStream and AsyncResponseStream.

Fixes #2643

Changes

  • Added response_id property to ResponseStream and AsyncResponseStream — returns the response ID from the accumulated snapshot (available after the first response.created event)
  • Added cancel() method to ResponseStream and async cancel() to AsyncResponseStream — calls the cancel API endpoint and closes the stream
  • Added current_snapshot property to ResponseStreamState to cleanly expose the internal snapshot
  • Passed a cancel_response callback from Responses.stream() and AsyncResponses.stream() to the stream managers and stream classes, using a callback pattern to avoid coupling stream classes to the resource layer
  • Added 11 unit tests covering: response_id availability, error cases, callback invocation, and stream closure
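The callback pattern described above can be sketched roughly as follows. This is a minimal illustration, not the SDK's actual code: ResponseStreamState, ResponseStream, and the cancel_response callback signature here are simplified stand-ins for the real classes in src/openai/lib/streaming/responses/_responses.py.

```python
from typing import Callable, Dict, Optional


class ResponseStreamState:
    """Accumulates streamed events into a snapshot of the response."""

    def __init__(self) -> None:
        self._snapshot: Optional[Dict] = None

    def handle_event(self, event: Dict) -> None:
        # The snapshot becomes available after the first response.created event.
        if event.get("type") == "response.created":
            self._snapshot = event["response"]

    @property
    def current_snapshot(self) -> Optional[Dict]:
        return self._snapshot


class ResponseStream:
    """Receives a cancel callback instead of importing the resource layer."""

    def __init__(self, cancel_response: Callable[[str], Dict]) -> None:
        self.state = ResponseStreamState()
        self._cancel_response = cancel_response
        self.closed = False

    @property
    def response_id(self) -> str:
        snapshot = self.state.current_snapshot
        if snapshot is None:
            raise RuntimeError(
                "response_id is not available until a response.created event arrives"
            )
        return snapshot["id"]

    def cancel(self) -> Dict:
        # The stream is closed whether or not the cancel API call succeeds.
        try:
            return self._cancel_response(self.response_id)
        finally:
            self.closed = True
```

Because the stream object only holds a plain callable, it never needs to import the Responses resource, which is what keeps the two layers decoupled.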

Usage

# Sync
with client.responses.stream(model="gpt-4o", input="Write a long essay") as stream:
    for event in stream:
        if should_stop(event):
            cancelled = stream.cancel()
            break

# Async (assumes an AsyncOpenAI client)
async with client.responses.stream(model="gpt-4o", input="Write a long essay") as stream:
    async for event in stream:
        if should_stop(event):
            cancelled = await stream.cancel()
            break

Backwards Compatibility

  • All new parameters use None defaults — fully backwards compatible
  • No changes to existing method signatures or behavior

Adds response_id property and cancel() method to ResponseStream
and AsyncResponseStream, allowing users to cancel a response
mid-stream. Uses a callback pattern to avoid coupling stream
classes to the Responses resource.

Refs: openai#2643
Copilot AI review requested due to automatic review settings March 2, 2026 11:54
@giulio-leone giulio-leone requested a review from a team as a code owner March 2, 2026 11:54

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b77dd8cfd6



Copilot AI left a comment


Pull request overview

Adds first-class cancellation support for streaming Responses by exposing response_id and a new cancel() API on both sync and async streaming helpers, wired through the Responses.stream() / AsyncResponses.stream() managers.

Changes:

  • Add response_id property on ResponseStream / AsyncResponseStream (backed by a new ResponseStreamState.current_snapshot accessor).
  • Add cancel() / async cancel() on stream objects that invokes the Responses cancel endpoint via a passed callback and closes the stream.
  • Add unit tests covering response_id, cancel preconditions, callback invocation, and snapshot exposure.

Reviewed changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

  • src/openai/lib/streaming/responses/_responses.py: adds cancel callback plumbing, response_id, cancel() methods, and exposes current_snapshot.
  • src/openai/resources/responses/responses.py: wires self.cancel into stream managers so streaming objects can call the cancel endpoint.
  • tests/lib/responses/test_response_stream_cancel.py: adds unit tests for cancellation behavior and snapshot/ID availability.


giulio-leone added a commit to giulio-leone/openai-python that referenced this pull request Mar 2, 2026
- Break long ResponseStreamManager return into multi-line (ruff line-length)
- Close stream before issuing cancel API call to release the HTTP connection
- Prevents connection pool deadlock with constrained clients (e.g. max_connections=1)
- Applied to both sync ResponseStream.cancel() and async AsyncResponseStream.cancel()

Refs: openai#2916
Use try/finally so that stream.close() always runs even if the cancel
API call raises an exception. Applies to both sync and async paths.

Refs: openai#2916
@giulio-leone giulio-leone force-pushed the feat/streaming-response-cancel branch from 111faaa to aad7295 March 2, 2026 16:25
giulio-leone and others added 4 commits March 2, 2026 18:28
Remove the pre-try await self.close() that would close the stream before
the cancel API call, and keep only the finally block to ensure close() is
always called regardless of cancel API success/failure.


Development

Successfully merging this pull request may close these issues.

Cancel for streaming Responses

2 participants