Releases: Query-farm/vgi-rpc-python

v0.1.9

03 Mar 05:52

What's New

max_stream_response_time for time-based stream batching

Producer stream responses can now buffer multiple batches up to a configurable wall-time limit before emitting a continuation token:

app = make_wsgi_app(
    server,
    max_stream_response_time=2.0,  # buffer up to 2 seconds
)

It can be combined with max_stream_response_bytes; the response breaks on whichever limit is reached first.

When neither limit is set, each produce cycle still emits one batch per HTTP response for incremental streaming (unchanged default behavior).
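The whichever-limit-first behavior can be sketched generically. This is an illustration of the batching policy, not the library's internals; `batch_until_limit` is a hypothetical helper, and `len(batch)` stands in for the serialized batch size:

```python
import time

def batch_until_limit(batches, max_seconds=None, max_bytes=None):
    """Group batches into responses, ending a response when either the
    wall-time or the byte limit is reached, whichever comes first.
    With no limits set, each batch becomes its own response."""
    buffered, size, start = [], 0, time.monotonic()
    for batch in batches:
        buffered.append(batch)
        size += len(batch)
        over_time = max_seconds is not None and time.monotonic() - start >= max_seconds
        over_bytes = max_bytes is not None and size >= max_bytes
        if over_time or over_bytes or (max_seconds is None and max_bytes is None):
            yield buffered
            buffered, size, start = [], 0, time.monotonic()
    if buffered:
        yield buffered  # flush whatever remains at end of stream

# No limits: one batch per response (the unchanged default).
print(list(batch_until_limit([b"a", b"b"])))  # [[b'a'], [b'b']]

# Byte limit: batches are grouped until the limit trips.
print(list(batch_until_limit([b"aa", b"bb", b"cc"], max_bytes=4)))
```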

v0.1.8

03 Mar 00:18

What's New

--http mode for run_server()

Workers using run_server() can now serve over HTTP with zero code changes by passing CLI flags:

my-worker --http                          # auto-select port
my-worker --http --port 8080              # fixed port
my-worker --http --host 0.0.0.0 --port 8080  # bind to all interfaces

Without --http, behavior is unchanged (stdin/stdout pipe transport).

serve_http() convenience function

New serve_http() in vgi_rpc.http wraps make_wsgi_app() + waitress with automatic free-port selection:

from vgi_rpc.http import serve_http
from vgi_rpc.rpc import RpcServer

server = RpcServer(MyProtocol, MyImpl())
serve_http(server, host="127.0.0.1", port=0)  # prints PORT:<n> to stdout

Requires pip install vgi-rpc[http].
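The automatic free-port selection behind `port=0` is the standard OS trick of binding to port zero. A minimal stdlib sketch (the `pick_free_port` helper is hypothetical, not part of vgi-rpc's API):

```python
import socket

def pick_free_port(host="127.0.0.1"):
    """Ask the OS for an unused TCP port by binding to port 0,
    then report the port number it assigned."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind((host, 0))
        return s.getsockname()[1]

port = pick_free_port()
print(f"PORT:{port}")  # same PORT:<n> convention as serve_http()
```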

v0.1.7

02 Mar 22:43

What's Changed

  • Cache-Control headers on GET pages: Landing page and describe page now return Cache-Control: no-cache, no-store, must-revalidate, max-age=0 to prevent browser caching. POST endpoints are unaffected.
  • Landing page layout: Moved "Powered by vgi-rpc" info from the page body into the footer for a cleaner layout.
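The Cache-Control behavior can be sketched as a small WSGI wrapper. This is illustrative only; `add_no_cache` is a hypothetical helper, not the library's actual implementation:

```python
NO_CACHE = "no-cache, no-store, must-revalidate, max-age=0"

def add_no_cache(app):
    """Wrap a WSGI app so GET responses carry the no-cache header;
    other methods (e.g. POST) pass through untouched."""
    def wrapped(environ, start_response):
        def sr(status, headers, exc_info=None):
            if environ.get("REQUEST_METHOD") == "GET":
                headers = [h for h in headers if h[0].lower() != "cache-control"]
                headers.append(("Cache-Control", NO_CACHE))
            return start_response(status, headers, exc_info)
        return app(environ, sr)
    return wrapped

def landing_page(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>...</html>"]

app = add_no_cache(landing_page)
seen = {}
app({"REQUEST_METHOD": "GET"},
    lambda status, headers, exc_info=None: seen.update(headers=headers))
print(seen["headers"])
```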

v0.1.6

02 Mar 22:17

What's Changed

HTTP transport fix

  • Add _DrainRequestMiddleware to ensure the HTTP request body is fully consumed before the WSGI response is returned. Fixes TypeError("Can't read from request stream after response has been sent.") on Cloudflare Workers when error paths (invalid tokens, malformed requests) returned a response without draining the request body.
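The drain technique can be sketched as follows. This is a simplified illustration of the idea, not vgi-rpc's actual `_DrainRequestMiddleware`:

```python
import io

class DrainRequestMiddleware:
    """WSGI middleware sketch: read any unconsumed request body before
    handing back the response, so the server never sends a response
    while bytes remain unread on the request stream."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        result = self.app(environ, start_response)
        body = environ.get("wsgi.input")
        if body is not None:
            while body.read(65536):  # discard any remaining request bytes
                pass
        return result

# An error-path app that never reads the request body.
def reject(environ, start_response):
    start_response("400 Bad Request", [("Content-Type", "text/plain")])
    return [b"invalid token"]

wrapped = DrainRequestMiddleware(reject)
stream = io.BytesIO(b"x" * 100_000)
response = wrapped({"wsgi.input": stream}, lambda s, h, e=None: None)
print(stream.read())  # b'' -- the body was fully drained
```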

Full Changelog: v0.1.5...v0.1.6

v0.1.5

28 Feb 15:26

What's Changed

UTF-8 safe Arrow IPC metadata

  • Base64-encode the binary state token in vgi_rpc.stream_state#b64 metadata to comply with Arrow IPC's UTF-8 requirement for cross-language compatibility. Keys ending in #b64 signal that the value is base64-encoded binary data.
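The `#b64` convention can be illustrated with a pair of hypothetical helpers (the suffix and base64 encoding follow the description above; the function names are not part of vgi-rpc's API):

```python
import base64

def encode_binary_metadata(key: str, value: bytes):
    """Store binary data in UTF-8-only metadata: base64-encode the
    value and mark the key with a '#b64' suffix."""
    return f"{key}#b64", base64.b64encode(value).decode("ascii")

def decode_metadata_value(key: str, value: str):
    """Reverse the encoding: a '#b64'-suffixed key signals that the
    value is base64-encoded binary data."""
    if key.endswith("#b64"):
        return key[: -len("#b64")], base64.b64decode(value)
    return key, value

k, v = encode_binary_metadata("vgi_rpc.stream_state", b"\x00\xffstate")
print(k, v)                      # the encoded value is plain ASCII
print(decode_metadata_value(k, v))
```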

HTTP transport improvements

  • Add Accept-Encoding: zstd to HTTP client requests when compression is enabled
  • Remove unnecessary _drain_stream call from HTTP stream exchange

Conformance suite

  • Add rich multi-type headers and dynamic schema streams to conformance suite
  • Add 5-second per-test timeout to conformance suite and cross-language docs
  • Remove pipe-only skip from large data conformance tests

Other

  • Add optional Sentry integration for error reporting
  • Extract parameter descriptions from docstrings for __describe__ introspection
  • Fix debug logging on Windows during interpreter shutdown (closed stderr)

Full Changelog: v0.1.4...v0.1.5

v0.1.4

26 Feb 00:05

Full Changelog: v0.1.3...v0.1.4

v0.1.3

25 Feb 01:00

Full Changelog: v0.1.2...v0.1.3

v0.1.2

21 Feb 19:54

Full Changelog: v0.1.1...v0.1.2

v0.1.1

21 Feb 18:47

Full Changelog: v0.1.0...v0.1.1

v0.1.0

21 Feb 18:45
