@@ -133,7 +133,7 @@ import handler from "./handler.js";
 addEventListener("fetch", (event) => {
   const handle = createCacheHandler({
     handler: handleRequest,
-    runInBackground: event.waitUntil,
+    runInBackground: event.waitUntil.bind(event),
   });
   event.respondWith(handle(event.request));
 });
@@ -284,13 +284,97 @@ import type {
} from "cache-handlers";
```

## Important Caveats & Behaviour

### Race Conditions

**Concurrent Cache Writes**: Multiple simultaneous requests for the same resource may result in duplicate cache writes. The last write wins, but all requests will complete successfully. This is generally harmless but may cause temporary inconsistency during high concurrency.

**Background Revalidation**: When using `stale-while-revalidate`, multiple concurrent requests during the SWR window will each trigger their own background revalidation. The library does not deduplicate these - each will run independently. Consider using request deduplication at the application level if this is a concern.
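
If duplicate revalidations are a problem for your origin, one option is to deduplicate in-flight calls to the origin handler so that overlapping revalidations share a single upstream fetch. A minimal sketch, assuming a Fetch-API runtime; the `dedupe` wrapper below is illustrative and not part of cache-handlers:

```ts
// Share one in-flight origin fetch per method + URL; each caller gets a clone.
const inFlight = new Map<string, Promise<Response>>();

function dedupe(handler: (request: Request) => Promise<Response>) {
  return async (request: Request): Promise<Response> => {
    const key = `${request.method} ${request.url}`;
    let pending = inFlight.get(key);
    if (!pending) {
      pending = handler(request).finally(() => inFlight.delete(key));
      inFlight.set(key, pending);
    }
    // Clone so each concurrent caller can consume the body independently.
    return (await pending).clone();
  };
}

// Usage: wrap the origin handler before handing it to createCacheHandler, e.g.
// createCacheHandler({ handler: dedupe(handleRequest), ... })
```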

**Invalidation During Revalidation**: Cache invalidation operations may occur while background revalidation is in progress. The invalidation will complete immediately, but in-flight revalidations may still write back to the cache, potentially restoring stale data.

### Platform-Specific CacheStorage Behaviour

Different platforms implement the Web Standard `CacheStorage` API with varying capabilities and limitations:

#### Cloudflare Workers
```ts
// ✅ Full support for all features
// ✅ Persistent across requests within the same data center
// ✅ Automatic geographic distribution
// ⚠️ Cache keys limited to ~8KB total URL length
// ⚠️ Cache entries expire after ~1 hour of inactivity
```

#### Deno Deploy
```ts
// ✅ Full CacheStorage support
// ✅ Persistent across deployments in same region
// ⚠️ Regional caches - not globally distributed
// ⚠️ Cache may not persist during deployment updates
```

#### Node.js (with undici polyfill)
```ts
// ✅ Works via undici polyfill
// ⚠️ In-memory only by default - not persistent across restarts
// ⚠️ Limited to single process - no cross-process sharing
// 💡 Consider using Redis or similar for production Node.js deployments
```
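
To get a `caches` global in plain Node.js, the polyfill usually has to be wired up manually before the handler is created. A rough sketch, assuming a recent undici release that exports its Cache API implementation as `caches` (verify this export against your installed undici version):

```ts
// Node.js only: expose undici's in-memory CacheStorage as the global `caches`.
import { caches as undiciCaches } from "undici";

if (!("caches" in globalThis)) {
  // Cast because Node's type definitions do not declare a `caches` global.
  (globalThis as any).caches = undiciCaches;
}
```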

#### Netlify Edge Functions
```ts
// ✅ CacheStorage available
// ⚠️ Cache is per-edge location, not globally consistent
// ⚠️ Cache may be cleared during deployments
```

#### Vercel Edge Runtime
```ts
// ❌ CacheStorage not available
// 💡 Use Vercel's built-in caching mechanisms instead
```
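
Because platform support varies (Vercel Edge in particular has no CacheStorage), it can be worth feature-detecting `caches` and falling back to the uncached handler when it is missing. A hedged sketch built on the fetch-listener example from earlier in this README; whether you want an uncached fallback or a hard failure is an application decision:

```ts
addEventListener("fetch", (event) => {
  // No CacheStorage on this runtime: serve directly from the origin handler.
  if (typeof caches === "undefined") {
    event.respondWith(handleRequest(event.request));
    return;
  }

  const handle = createCacheHandler({
    handler: handleRequest,
    runInBackground: event.waitUntil.bind(event),
  });
  event.respondWith(handle(event.request));
});
```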

### Memory and Performance Considerations

**Large Responses**: Caching large responses (>1MB) may impact performance and memory usage. Consider streaming or chunked responses for large payloads.
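
If very large payloads are a concern, one approach is to have the origin handler mark them as uncacheable so they bypass the cache entirely. This assumes the cache handler honours standard `Cache-Control` semantics for the responses it stores (check this against your configuration); the size threshold and handler below are illustrative:

```ts
// Illustrative origin handler that opts large responses out of caching.
const MAX_CACHEABLE_BYTES = 1_000_000; // ~1 MB

async function handleLargeAwareRequest(request: Request): Promise<Response> {
  const response = await fetch(request);
  const length = Number(response.headers.get("content-length") ?? 0);

  if (length > MAX_CACHEABLE_BYTES) {
    const headers = new Headers(response.headers);
    headers.set("cache-control", "no-store"); // keep large bodies out of the cache
    return new Response(response.body, { status: response.status, headers });
  }

  return response;
}
```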

**Cache Key Generation**: Complex cache key generation (e.g., with many vary parameters) can impact performance. Keep cache keys simple when possible.
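
As a rule of thumb, derive keys from the canonical URL plus only the headers that genuinely change the response. A library-agnostic sketch; the key format and the `cacheKeyFor` helper are illustrative, not the scheme cache-handlers uses internally:

```ts
// Illustrative cache-key derivation: normalised URL plus a small, fixed set of vary headers.
function cacheKeyFor(
  request: Request,
  varyHeaders: string[] = ["accept-encoding"],
): string {
  const url = new URL(request.url);
  url.hash = ""; // fragments never reach the server
  url.searchParams.sort(); // make query parameter order irrelevant

  const varyPart = varyHeaders
    .map((name) => `${name}=${request.headers.get(name) ?? ""}`)
    .join("&");

  return `${request.method} ${url.toString()} ${varyPart}`;
}
```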

**Metadata Overhead**: Cache tag metadata is stored separately and may grow large with many tagged entries. Monitor cache statistics and clean up unused tags periodically.

### Debugging and Monitoring

Enable debug logging to understand cache behaviour:
0 commit comments