feat: stale-if-error support #1307
Status: Open

TartanLlama wants to merge 40 commits into `main` from `sy/stale-if-error`.
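The feature being added is `staleIfError`: a window after a cached response's `ttl` expires during which the stale copy may still be served if fetching a fresh one fails. A minimal sketch of that timing model (the function name and shape here are illustrative, not the PR's actual implementation):

```javascript
// Hypothetical sketch: classify a cached entry by its age in seconds, given
// its ttl and staleIfError windows (both in seconds, as in the tests below).
function cacheState(ageSeconds, ttl, staleIfError = 0) {
  if (ageSeconds < ttl) return 'fresh'; // serve straight from cache
  if (ageSeconds < ttl + staleIfError) return 'usable-on-error'; // serve only if refetch fails
  return 'expired'; // must refetch; errors propagate
}

console.log(cacheState(30, 300, 600));   // 'fresh'
console.log(cacheState(400, 300, 600));  // 'usable-on-error'
console.log(cacheState(1000, 300, 600)); // 'expired'
```

With `staleIfError: 0` (or unset), the middle window collapses and a stale entry goes straight to `'expired'`, which is what the zero-value test in the diff checks for on the property itself.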
Changes from all commits:
All 40 commits are by TartanLlama:

- e47fd27 Initial work
- 6809d57 Matching Rust (partial)
- 2938bb8 Finish draft implementation
- 7be7b94 Don't overwrite cache override
- 899a6ab Merge branch 'main' into sy/stale-if-error
- 23917ac Compile clean
- dbf5729 Format
- f66c0ce Compile
- e552d72 Fmt
- 439b2ce Update runtime/fastly/builtins/fetch/fetch.cpp
- 247c3b4 Correct symbol name
- d933b0b Almost there
- 646023b Revert test change
- d24ab07 Rebase
- cd2ba1b Merge branch 'main' into sy/stale-if-error
- 44213da Fix test
- 0e74678 Revert
- 6c6e329 Revert
- c043153 Revert
- 3294384 fmt
- 41613f7 Remove debug output
- 0f20fb0 Cleanup
- d45c030 fmt
- ec666e4 Fmt
- 68bf8f5 fmt
- 2a5e865 Merge branch 'main' into sy/stale-if-error
- 6853ac0 Cleanup
- 094cca1 Cleanup
- 8b16dde Merge branch 'sy/stale-if-error' of github.com:fastly/js-compute-runt…
- 4145bb6 Cleanup
- b1ae77f Cleanup
- b738ab8 Complete implementation
- 5b16bb3 fmt
- a5d1f9f tests
- c45e6f6 fmt
- 62e5794 Docs
- cab0957 Merge branch 'main' into sy/stale-if-error
- c4c7c32 Update runtime/fastly/builtins/fetch/request-response.cpp
- 499c392 Remove special-casing 5XX errors
- 27c9332 Feedback
New file: `integration-tests/js-compute/fixtures/app/src/stale-if-error.js` (353 additions, 0 deletions):
```js
/// <reference path="../../../../../types/index.d.ts" />
/* eslint-env serviceworker */

import { CacheOverride } from 'fastly:cache-override';
import { assert, assertThrows, strictEqual } from './assertions.js';
import { isRunningLocally, routes } from './routes.js';

// CacheOverride staleIfError property
{
  routes.set(
    '/stale-if-error/cache-override/constructor-with-staleIfError',
    async () => {
      const override = new CacheOverride('override', { staleIfError: 300 });
      assert(
        override.staleIfError,
        300,
        `new CacheOverride('override', { staleIfError: 300 }).staleIfError === 300`,
      );
    },
  );

  routes.set(
    '/stale-if-error/cache-override/constructor-without-staleIfError',
    async () => {
      const override = new CacheOverride('override', { ttl: 300 });
      assert(
        override.staleIfError,
        undefined,
        `new CacheOverride('override', { ttl: 300 }).staleIfError === undefined`,
      );
    },
  );

  routes.set('/stale-if-error/cache-override/set-staleIfError', async () => {
    const override = new CacheOverride('override', {});
    override.staleIfError = 600;
    assert(
      override.staleIfError,
      600,
      `Setting override.staleIfError = 600 works correctly`,
    );
  });
}

// Response staleIfError property
{
  routes.set(
    '/stale-if-error/response/property-undefined-on-non-cached',
    async () => {
      const response = new Response('test body', {
        status: 200,
        headers: { 'Content-Type': 'text/plain' },
      });
      assert(
        response.staleIfError,
        undefined,
        `Non-cached response.staleIfError === undefined`,
      );
    },
  );

  routes.set(
    '/stale-if-error/response/setter-throws-on-non-cached',
    async () => {
      const response = new Response('test body');
      assertThrows(
        () => {
          response.staleIfError = 300;
        },
        TypeError,
        'Response set: staleIfError must be set only on unsent cache transaction responses',
      );
    },
  );

  routes.set(
    '/stale-if-error/response/staleIfErrorAvailable-throws-outside-afterSend',
    async () => {
      const response = new Response('test body');
      assertThrows(
        () => {
          response.staleIfErrorAvailable();
        },
        TypeError,
        'Response: staleIfErrorAvailable() must can only be called on candidate responses inside afterSend callback',
      );
    },
  );
}

// Integration tests with fetch
routes.set('/stale-if-error/fetch/with-cache-override', async () => {
  const url = 'https://http-me.fastly.dev/now?stale-if-error-test-1';

  // First request: populate cache with staleIfError
  const response1 = await fetch(url, {
    backend: 'httpme',
    cacheOverride: new CacheOverride('override', {
      ttl: 300,
      staleIfError: 600,
    }),
  });

  assert(
    response1.staleIfError,
    600,
    `First response has staleIfError === 600`,
  );
  assert(typeof response1.ttl, 'number', `First response has numeric ttl`);
});

routes.set(
  '/stale-if-error/fetch/staleIfErrorAvailable-after-caching',
  async () => {
    const sharedCacheKey = 'stale-if-error-available-test-' + Date.now();

    // Step 1: Prime the cache with a response that has staleIfError
    const primeRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        ttl: 1, // 1 second TTL - will be stale quickly
        staleIfError: 600, // Long stale-if-error window
      }),
    });
    primeRequest.setCacheKey(sharedCacheKey);
    await fetch(primeRequest);

    // Step 2: Wait for the cached response to go stale
    await new Promise((resolve) => setTimeout(resolve, 1500));

    // Step 3: Make a second request with same cache key
    // There's now a stale cached response with staleIfError available
    let staleIfErrorAvailableResult;
    let staleIfErrorValue;

    const checkRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        staleIfError: 600,
        afterSend(candidateResponse) {
          // Check staleIfErrorAvailable() on the candidate response
          staleIfErrorAvailableResult =
            candidateResponse.staleIfErrorAvailable();
          staleIfErrorValue = candidateResponse.staleIfError;
        },
      }),
    });
    checkRequest.setCacheKey(sharedCacheKey);
    await fetch(checkRequest);

    assert(
      staleIfErrorAvailableResult,
      true,
      `staleIfErrorAvailable() returns true when stale cached response with staleIfError exists`,
    );
    assert(
      staleIfErrorValue,
      600,
      `Cached response preserves staleIfError === 600`,
    );
  },
);

routes.set(
  '/stale-if-error/fetch/staleIfErrorAvailable-false-without-staleIfError',
  async () => {
    const url = `https://http-me.fastly.dev/now?stale-if-error-test-no-sie-${Date.now()}`;

    let staleIfErrorAvailableResult;

    await fetch(url, {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        ttl: 10,
        // No staleIfError configured
        afterSend(candidateResponse) {
          // Check staleIfErrorAvailable() on the candidate response
          staleIfErrorAvailableResult =
            candidateResponse.staleIfErrorAvailable();
        },
      }),
    });

    assert(
      staleIfErrorAvailableResult,
      false,
      `staleIfErrorAvailable() returns false when staleIfError is not configured`,
    );
  },
);

routes.set('/stale-if-error/fetch/with-swr-and-staleIfError', async () => {
  const url = 'https://http-me.fastly.dev/now?stale-if-error-test-4';

  const response = await fetch(url, {
    backend: 'httpme',
    cacheOverride: new CacheOverride('override', {
      ttl: 60,
      swr: 300,
      staleIfError: 600,
    }),
  });

  assert(response.ttl, 60, `response.ttl === 60`);
  assert(response.swr, 300, `response.swr === 300`);
  assert(response.staleIfError, 600, `response.staleIfError === 600`);
});

routes.set('/stale-if-error/fetch/zero-staleIfError', async () => {
  const url = 'https://http-me.fastly.dev/now?stale-if-error-test-5';

  const response = await fetch(url, {
    backend: 'httpme',
    cacheOverride: new CacheOverride('override', {
      ttl: 300,
      staleIfError: 0,
    }),
  });

  assert(
    response.staleIfError,
    0,
    `response.staleIfError === 0 when explicitly set to zero`,
  );
});

routes.set(
  '/stale-if-error/fetch/serve-stale-on-afterSend-exception',
  async () => {
    const sharedCacheKey = 'stale-if-error-aftersend-exception-' + Date.now();

    // Step 1: Cache a successful response with short TTL and long stale-if-error
    const goodRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        ttl: 1, // 1 second TTL - will be stale quickly
        staleIfError: 3600, // 1 hour stale-if-error window
      }),
    });
    goodRequest.setCacheKey(sharedCacheKey);

    const goodResponse = await fetch(goodRequest);
    assert(goodResponse.status, 200, 'Initial response is 200');
    const initialBody = await goodResponse.text();

    // Step 2: Wait for TTL to expire (make response stale)
    await new Promise((resolve) => setTimeout(resolve, 1500));

    // Step 3: Request with afterSend hook that throws an exception
    const errorRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        staleIfError: 3600,
        afterSend(response) {
          throw new Error('afterSend hook intentionally failed');
        },
      }),
    });
    errorRequest.setCacheKey(sharedCacheKey);

    const staleResponse = await fetch(errorRequest);

    // Step 4: Verify we got the stale 200 response despite afterSend throwing
    assert(
      staleResponse.status,
      200,
      'Stale-if-error serves cached response when afterSend throws',
    );

    const cachedBody = await staleResponse.text();
    strictEqual(
      cachedBody,
      initialBody,
      'Response body is from the original cached response',
    );

    // The masked error should be the exception that was thrown
    assert(
      staleResponse.maskedError instanceof Error,
      true,
      'The masked error is an Error object',
    );
    strictEqual(
      staleResponse.maskedError.message,
      'afterSend hook intentionally failed',
      'The masked error message matches the thrown exception',
    );
  },
);

routes.set(
  '/stale-if-error/fetch/serve-stale-on-beforeSend-exception',
  async () => {
    const sharedCacheKey = 'stale-if-error-beforesend-exception-' + Date.now();

    // Step 1: Cache a successful response with short TTL and long stale-if-error
    const goodRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        ttl: 1, // 1 second TTL - will be stale quickly
        staleIfError: 3600, // 1 hour stale-if-error window
      }),
    });
    goodRequest.setCacheKey(sharedCacheKey);

    const goodResponse = await fetch(goodRequest);
    assert(goodResponse.status, 200, 'Initial response is 200');
    const initialBody = await goodResponse.text();

    // Step 2: Wait for TTL to expire (make response stale)
    await new Promise((resolve) => setTimeout(resolve, 1500));

    // Step 3: Request with beforeSend hook that throws an exception
    const errorRequest = new Request('https://http-me.fastly.dev/now', {
      backend: 'httpme',
      cacheOverride: new CacheOverride('override', {
        staleIfError: 3600,
        beforeSend(request) {
          throw new Error('beforeSend hook intentionally failed');
        },
      }),
    });
    errorRequest.setCacheKey(sharedCacheKey);

    const staleResponse = await fetch(errorRequest);

    // Step 4: Verify we got the stale 200 response despite beforeSend throwing
    assert(
      staleResponse.status,
      200,
      'Stale-if-error serves cached response when beforeSend throws',
    );

    const cachedBody = await staleResponse.text();
    strictEqual(
      cachedBody,
      initialBody,
      'Response body is from the original cached response',
    );

    // The masked error should be the exception that was thrown
    assert(
      staleResponse.maskedError instanceof Error,
      true,
      'The masked error is an Error object',
    );
    strictEqual(
      staleResponse.maskedError.message,
      'beforeSend hook intentionally failed',
      'The masked error message matches the thrown exception',
    );
  },
);
```
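The two serve-stale tests above assert the same contract: when a send hook throws while a cached copy is still inside its stale-if-error window, the stale body is served and the exception surfaces as `maskedError` rather than being thrown to the caller. A standalone simulation of that contract (the helper and entry shape are hypothetical, not runtime code):

```javascript
// Hypothetical simulation: if the refetch (or a send hook) throws and a stale
// cached entry is still inside its stale-if-error window, return the cached
// body and record the error as `maskedError` instead of propagating it.
function fetchWithStaleIfError(cacheEntry, nowSeconds, doFetch) {
  try {
    return { body: doFetch(), maskedError: undefined };
  } catch (err) {
    const age = nowSeconds - cacheEntry.storedAt;
    const usableUntil = cacheEntry.ttl + (cacheEntry.staleIfError ?? 0);
    if (age < usableUntil) {
      return { body: cacheEntry.body, maskedError: err };
    }
    throw err; // no usable stale copy: surface the error
  }
}

// Mirrors the test setup: ttl of 1s, long stale-if-error window, hook throws.
const entry = { body: 'cached body', storedAt: 0, ttl: 1, staleIfError: 3600 };
const result = fetchWithStaleIfError(entry, 2, () => {
  throw new Error('afterSend hook intentionally failed');
});
console.log(result.body);                // 'cached body'
console.log(result.maskedError.message); // 'afterSend hook intentionally failed'
```

This matches what the tests verify end to end: the served status and body come from the original cached response, while the hook's exception is preserved for inspection on the response object.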
Review comment:

> nit: Is this the same criterion as "candidate responses", below? In the Rust universe we have separate types for the two, but I think they're the same in JS; the restriction makes sense here, it just might be clearer if it used the same language.