Add presigned S3 URL support for large blob payloads #66
Merged
Conversation
Objects stored under the cloudevent/blobs/ key prefix are now served
via a short-lived presigned S3 GET URL (dataUrl field) instead of being
downloaded and embedded inline in the GraphQL response. This avoids
ballooning the JSON payload for large binary objects (e.g. scans).
- Add Presigner interface + PresignBlobURL method to eventrepo.Service
- Export BlobKeyPrefix constant ("cloudevent/blobs/")
- Add dataUrl: String field to GraphQL CloudEvent type
- Detect blob prefix in LatestCloudEvent / CloudEvents resolvers;
skip GetObject and presign against the primary bucket instead
- Wire s3.NewPresignClient into both GraphQL and gRPC app paths
- Update eventrepo.New signature; pass nil in tests that don't need presigning
- Add MockPresigner to generated mock file
https://claude.ai/code/session_015ReeLGeCywJfYkkrrng5wU
Describe the field in terms of behavior (large files) rather than the internal key prefix convention, which is an implementation detail that may change.
- pkg/eventrepo/presign_test.go: unit tests for PresignBlobURL covering correct bucket/key routing, the 15-minute TTL, presigner error propagation, and the nil-presigner guard
- internal/graph/blob_resolver_test.go: unit tests for the DataUrl resolver (nil wrapper, empty DataURL, populated DataURL) and a composite test that a blob wrapper returns nil for data/dataBase64 and a URL for dataUrl
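The resolver behavior those tests describe can be sketched as below, assuming a wrapper struct with a `DataURL` field; the type and field names here are hypothetical, inferred from the test descriptions rather than taken from the code:

```go
package main

import "fmt"

// cloudEventWrapper is a stand-in for the resolver's wrapper type
// that carries the presigned URL through the resolver chain.
type cloudEventWrapper struct {
	DataURL string
}

// DataUrl mirrors the described resolver: a nil wrapper or an empty
// DataURL yields nil, otherwise the presigned URL is returned.
func (w *cloudEventWrapper) DataUrl() *string {
	if w == nil || w.DataURL == "" {
		return nil
	}
	return &w.DataURL
}

func main() {
	var nilW *cloudEventWrapper
	fmt.Println(nilW.DataUrl() == nil)                   // nil wrapper case
	fmt.Println((&cloudEventWrapper{}).DataUrl() == nil) // empty DataURL case
	w := &cloudEventWrapper{DataURL: "https://example.invalid/presigned"}
	fmt.Println(*w.DataUrl()) // populated case
}
```

Calling `DataUrl()` on a nil pointer receiver is legal in Go because the method body checks `w == nil` before dereferencing, which is why the nil-wrapper test case is cheap to cover.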
Summary
This PR adds support for serving large binary blob payloads via presigned S3 URLs instead of inline in GraphQL responses. When a cloud event's payload is stored with the blob prefix, a short-lived presigned URL is returned in the new `dataUrl` field instead of the raw data.

Key Changes
- Add `dataUrl: String` field to the `CloudEvent` type to return presigned S3 URLs for blob payloads
- Add `Presigner` interface for generating presigned S3 GET URLs
- Export `BlobKeyPrefix` constant ("cloudevent/blobs/") to identify blob objects
- Add `PresignBlobURL()` method to generate 15-minute presigned URLs
- Update `New()` constructor to accept a presigner dependency
- Update `LatestCloudEvent()` to detect blob keys and return presigned URLs instead of fetching full payloads
- Update `CloudEvents()` to handle mixed blob and non-blob events, presigning blob URLs while fetching regular event data
- Add `DataURL` field to carry presigned URLs through the resolver chain
- Add `DataUrl()` resolver that returns the presigned URL from the wrapper
- Wire up `s3.NewPresignClient()`

Implementation Details
- Blob objects are detected by key prefix (`eventrepo.BlobKeyPrefix`)
- Presigned URLs expire after 15 minutes (`presignTTL`)
- The `CloudEvents` query efficiently separates blob and non-blob events, presigning blobs while concurrently fetching non-blob payloads
- Tests pass a `nil` presigner where not needed
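The blob-vs-inline routing the resolvers perform can be sketched as follows. This is a simplification under stated assumptions: `routePayload`, `presign`, and `getObject` are hypothetical stand-ins for the real resolver logic, `PresignBlobURL`, and the S3 `GetObject` fetch; only the prefix check itself comes from the PR.

```go
package main

import (
	"fmt"
	"strings"
)

// BlobKeyPrefix mirrors the exported constant from the PR.
const BlobKeyPrefix = "cloudevent/blobs/"

// routePayload sketches the branching described for the resolvers:
// blob keys get a presigned URL and GetObject is skipped; everything
// else is fetched and embedded inline as before.
func routePayload(key string, presign, getObject func(string) string) (dataURL, data string) {
	if strings.HasPrefix(key, BlobKeyPrefix) {
		return presign(key), "" // blob: return dataUrl only, no payload
	}
	return "", getObject(key) // regular event: embed payload inline
}

func main() {
	presign := func(k string) string { return "https://example.invalid/presigned/" + k }
	get := func(k string) string { return "payload-of-" + k }

	u, d := routePayload(BlobKeyPrefix+"scan-1", presign, get)
	fmt.Println(u, d == "")

	u, d = routePayload("cloudevent/events/e-1", presign, get)
	fmt.Println(u == "", d)
}
```

Because blobs short-circuit before any S3 `GetObject`, a list query mixing blob and non-blob events only pays the download cost for the small inline payloads, which is the efficiency claim in the implementation details above.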