Cache API
Store responses from web requests made from Functions and Edge Functions, making your application more performant, resilient and cost-efficient.
The Cache API is a great companion to Netlify's fine-grained cache controls, giving you the power to cache entire routes as well as their individual components.
Built entirely on web standards, it works seamlessly with any web framework — or without one.
# Overview
The Cache API is a programmatic interface for reading and writing HTTP responses to a cache using the standard CacheStorage and Cache JavaScript APIs.
Just like Functions and Edge Functions, the Cache API operates on `Request` and `Response` objects and has first-class support for the Fetch API.
You can use the Cache API to cache any resource on the web, whether it's hosted on Netlify or any other provider.
# Features
The Cache API uses standard cache control headers to determine the cache behavior for a response. This includes how long it can be cached for, how it's matched against a request and how it can be invalidated.
- Responses are automatically invalidated once their expiration time (defined by the `max-age` or `s-maxage` directives) has elapsed
- Responses can be manually invalidated by calling the `delete()` method or by purging any cache tags set on the response
- Responses stored with the Cache API are not replicated across regions, meaning that any functions and edge functions running in a given region all share the same cache data, but that data isn't shared with any functions or edge functions running in a different region
- Responses stored with the Cache API are automatically invalidated when your site is redeployed
- Requests to the Cache API are automatically routed to the closest region for optimal performance
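Because behavior is driven by standard cache control headers, a cacheable response carries everything the Cache API needs in its headers. Here is a minimal sketch; the `Netlify-Cache-Tag` header name is an assumption based on Netlify's cache-tag conventions, and the values are placeholders:

```typescript
// A sketch of a response the Cache API can store. The Cache-Control
// directive controls expiry; the tag header (assumed here to be
// Netlify-Cache-Tag) is what on-demand purging by tag would key on.
const response = new Response("Hello", {
  status: 200,
  headers: {
    // Cache for up to one hour; automatically invalidated afterwards.
    "Cache-Control": "public, s-maxage=3600",
    // Tags that allow this entry to be purged on demand.
    "Netlify-Cache-Tag": "product,sale",
  },
});
```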
# API reference
The Cache API offers the following methods, a subset of the standard CacheStorage and Cache APIs.
# caches.match
Retrieves a response from any of the `Cache` instances, if found.

```typescript
const response = await caches.match(request);
```
# Parameters
`request`: the request for which you are attempting to find responses in the caches; this can be a `Request` object or a URL string
# Return value
A `Promise` that resolves to the `Response` associated with the first matching request in any of the caches. If no match is found, the `Promise` resolves to `undefined`.
# caches.open
Opens a named cache instance where you can store and retrieve responses.
The name parameter defines a namespace for a specific set of cached responses. You should use something meaningful for your application context.
Keep in mind that responses aren't shared between caches, so using multiple names can fragment your cache and reduce your hit ratio.
```typescript
const cache = await caches.open("my-cache");
```
# Parameters
`name`: the name of the cache
# Return value
A `Promise` that resolves with a `Cache` instance.
# cache.add
Takes a URL, retrieves it, and adds the resulting response object to the given cache.
This is an instance method of a `Cache` object that must be created with `caches.open()`.

```typescript
const cache = await caches.open("my-cache");

await cache.add(request);
```
# Parameters
`request`: a request for the resource you want to add to the cache; this can be a `Request` object or a URL string
# Return value
A `Promise` that resolves with `undefined`.
# cache.addAll
Takes an array of URLs, retrieves them, and adds the resulting response objects to the given cache.
This is an instance method of a `Cache` object that must be created with `caches.open()`.

```typescript
const cache = await caches.open("my-cache");

const responses = await cache.addAll(requests);
```
# Parameters
`requests`: an array of requests for the resources you want to add to the cache; these can be `Request` objects or URL strings
# Return value
A `Promise` that resolves with `undefined`.
# cache.delete
Finds a response that matches the given request and deletes it from the cache.
This is an instance method of a `Cache` object that must be created with `caches.open()`.

```typescript
const cache = await caches.open("my-cache");

await cache.delete(request);
```
# Parameters
`request`: the request you are looking to delete from the cache; this can be a `Request` object or a URL
# Return value
A `Promise` that resolves with `true` if a matching response was found and deleted, or with `false` otherwise.
# cache.match
Retrieves a response from the cache, if found.
This is an instance method of a `Cache` object that must be created with `caches.open()`.

```typescript
const cache = await caches.open("my-cache");

const response = await cache.match(request);
```
# Parameters
`request`: the request for which you are attempting to find responses in the cache; this can be a `Request` object or a URL string
# Return value
A `Promise` that resolves to the `Response` associated with the first matching request in the cache. If no match is found, the `Promise` resolves to `undefined`.
# cache.put
Adds a response to the cache.
This is an instance method of a `Cache` object that must be created with `caches.open()`.

```typescript
const cache = await caches.open("my-cache");

await cache.put(request, response);
```
# Parameters
- `request`: the request to be added to the cache; this can be a `Request` object or a URL string
- `response`: the `Response` you want to match up to the request
# Return value
A `Promise` that resolves with `undefined`.
# Utility methods
The `@netlify/cache` module offers a set of utility methods that you can use on top of the base API to perform common tasks more easily. To use them, start by adding the module to your project using the package manager of your choice:

```shell
npm install @netlify/cache
```
# fetchWithCache
Returns a response for the given request if it's found in the cache. If not, a new request is made and the response is added to the cache.
It's a drop-in replacement for the standard `fetch` method, with an additional optional parameter for configuring the cache settings of the response that is added to the cache. These options override any conflicting cache settings that the response may define.
```typescript
fetchWithCache(resource);
fetchWithCache(resource, cacheSettings);
fetchWithCache(resource, options);
fetchWithCache(resource, options, cacheSettings);
```
# Parameters
- `resource`: the resource that you wish to fetch; this can be a string, a `URL` object, or a `Request`
- `options`: a standard `RequestInit` object containing any custom settings that you want to apply to the request
- `cacheSettings`: an object with the different cache settings to be set, with support for the following properties:
  - `durable`: a boolean indicating whether to persist the response in the durable cache
  - `overrideDeployRevalidation`: opts out of automatic invalidation with atomic deploys by specifying one or more cache tags that can be used for on-demand invalidation
  - `swr`: the value for the `stale-while-revalidate` directive, representing the amount of time (in seconds) after the response has expired during which it can still be served while it's revalidated in the background
  - `tags`: a list of cache tags to add to the response
  - `ttl`: the value for the `s-maxage` directive, representing the maximum amount of time (in seconds) that Netlify will cache the response
  - `vary`: an object containing the parts of the request to vary on, with support for one or more of the following properties:
    - `cookie`: a list of cookies to vary on
    - `country`: a list of countries to vary on, with nested arrays representing "or" conditions
    - `header`: a list of headers to vary on
    - `language`: a list of languages to vary on, with nested arrays representing language combinations
    - `query`: a list of URL query parameters to vary on, or `true` to vary on all query parameters
# Return value
A `Promise` that resolves to a `Response` object.
# Example
This example shows how you might use the `fetchWithCache` utility method to either retrieve a response from the cache or fetch it from the network. When the response is fetched from the network, it is then added to the cache with a set of options.
```typescript
import { fetchWithCache, DAY } from "@netlify/cache";
import type { Config, Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const response = await fetchWithCache("https://example.com/expensive-api", {
    ttl: 2 * DAY, // Two days
    tags: ["product", "sale"],
    vary: {
      cookie: ["ab_test_name", "ab_test_bucket"],
      query: ["item_id", "page"]
    }
  });

  return response;
};

export const config: Config = {
  path: "/fetchwithcache-example"
};
```
# getCacheStatus
Extracts information from the `Cache-Status` header about how the response has interacted with the different components of Netlify's global caching infrastructure.
The returned value makes it straightforward to differentiate cached and uncached responses, which is especially useful when measuring performance.
```typescript
getCacheStatus(cacheStatusHeader);
getCacheStatus(headers);
getCacheStatus(response);
```
# Parameters
- `cacheStatusHeader`: a string containing the values of the `Cache-Status` header you want to inspect
- `headers`: a `Headers` object containing the `Cache-Status` header you want to inspect
- `response`: the `Response` object containing the `Cache-Status` header you want to inspect
# Return value
An object containing the following properties:

- `hit`: a boolean indicating whether the response has been served from a Netlify cache
- `caches`: an object with granular information about the different Netlify caches:
  - `durable`: an object describing how the response has interacted with the durable cache:
    - `hit`: a boolean indicating whether the response has been served by the durable cache
    - `stale`: a boolean indicating whether the response matched a stale entry in the cache (i.e. older than the specified age settings)
    - `stored`: a boolean indicating whether the response has just been stored in the durable cache
    - `ttl`: the number of seconds left before the response's expiration date; a negative number represents how long ago the response expired
  - `edge`: an object describing how the response has interacted with the edge cache:
    - `hit`: a boolean indicating whether the response has been served by the edge cache
    - `stale`: a boolean indicating whether the response matched a stale entry in the cache (i.e. older than the specified age settings)
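The `Cache-Status` header itself is a standard structured field (RFC 9211). As an illustration only — the cache names and parameters below are assumptions for demonstration, not verified Netlify output — an edge-cache hit might look like this:

```typescript
// Illustrative only: an RFC 9211-style Cache-Status value as a response
// served from an edge cache might carry. Cache names are assumptions.
const headers = new Headers({
  "Cache-Status": '"Netlify Edge"; hit, "Netlify Durable"; fwd=miss; stored',
});

const cacheStatus = headers.get("Cache-Status");
```

Passing either these `headers` or the raw string to `getCacheStatus` yields the structured object described above.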
# Example
This example shows how you might use the `getCacheStatus` utility method to determine whether a response was retrieved from the cache, including information about which of Netlify's caching layers has served the response.
```typescript
import { fetchWithCache, getCacheStatus } from "@netlify/cache";
import type { Config, Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const response = await fetchWithCache("https://example.com/expensive-api");
  const { hit, edge, durable } = getCacheStatus(response);

  if (hit) {
    console.log("Served from the cache:");
    console.log(`- Edge cache: ${edge?.hit ? "hit" : "miss"}`);
    console.log(`- Durable cache: ${durable?.hit ? "hit" : "miss"}`);
  } else {
    console.log("Served from the network:");
    console.log(`- Edge cache: ${edge?.stale ? "stale" : "not found"}`);
    console.log(`- Durable cache: ${durable?.stale ? "stale" : "not found"}`);
  }

  return response;
};

export const config: Config = {
  path: "/getcachestatus-example"
};
```
# Usage with frameworks
The Cache API has been designed to work seamlessly with any web framework that deploys to Netlify. For the best development experience possible, there are a few important things to note.
The CacheStorage API is exposed through the `caches` global variable, as defined by the spec. The API isn't part of Node.js, though, which means this variable usually isn't available in Node.js environments.
On Netlify, we expose it automatically in the Netlify Functions and Netlify Edge Functions runtimes, both on live sites and locally with Netlify Dev.
However, some frameworks have a setup where functions are not used in local development, or you might choose to run your framework's own development command and not use the Netlify CLI at all. In those cases, the `caches` global will not be set, and trying to access it will cause an error.
To get around that, you can import it from the `@netlify/cache` package.
```typescript
import { caches } from "@netlify/cache";

const cache = await caches.open("my-cache");

await cache.put("https://example.com", new Response("Hello"));
```
Using this import does not change the functionality in any way, and does not require any further code changes. It's only required for local development in certain setups.
We're working with framework maintainers to make this global variable part of their local development servers, at which point you'll be able to completely remove the import.
# Limits
There is a limit to how many Cache API operations you can perform on a single serverless function or edge function invocation:
- Maximum number of cache lookups per invocation: 100
- Maximum number of cache insertions or deletions per invocation: 20
If you exceed any of these limits, any subsequent cache lookups will not return any response and any insertions or deletions will not actually modify the state of the cache.
When multiple edge functions run for a given request, these limits are shared by all the edge functions.
The limits are not shared between serverless functions and edge functions, which means that each group of functions will have their own quota.
# Notes and limitations
Keep the following limitations in mind when working with the Cache API:
- The `stale-while-revalidate` directive is not supported and will be ignored if present on any of the supported cache control headers
- The Cache API is available in the local development environment when using version 20.0.3 or above of the Netlify CLI; note that no cached responses are actually persisted anywhere, which means that lookups will not return a response and insertions or deletions will not mutate any state
- As per the standard Cache API specification, it is not possible to cache partial responses, responses with a `Vary` header set to `*`, or responses for a request with a method other than `GET`
While the Netlify Cache API tries to follow the standard CacheStorage and Cache APIs as closely as possible, there are some implementation differences associated with operating a cache on a globally-distributed infrastructure instead of the browser:
- The `keys()` method is not implemented and there is currently no way to list the contents of the cache
- While reads and writes are strongly consistent, deletions are eventually consistent, so reading an entry after deleting it may still yield the cached response for a short period of time while the deletion is propagated across the network
# Examples
Let's imagine a function that takes some input via a URL query parameter and uses it to make an HTTP request to an external API.
This fictional API is slow and expensive, so it's in our best interest to call it only when absolutely necessary. The following example shows how you might do this with the Cache API.
```typescript
import type { Config, Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const request = new Request("https://example.com/expensive-api");

  // Call the external API. This will happen on every request.
  const response = await fetch(request);

  return response;
};

export const config: Config = {
  path: "/cache-api-example"
};
```
```typescript
import type { Config, Context } from "@netlify/functions";

const cache = await caches.open("my-cache");

export default async (req: Request, context: Context) => {
  const request = new Request("https://example.com/expensive-api");

  // Look for the response in the cache.
  const cached = await cache.match(request);

  if (cached) {
    return cached;
  }

  // It's not in the cache, so let's fetch it.
  const fresh = await fetch(request);

  // Store it in the cache for future invocations. The response must be cloned
  // so that we can simultaneously stream it to the client and to the cache.
  if (fresh.ok) {
    cache.put(request, fresh.clone()).catch((error) => {
      console.error("Failed to add to the cache:", error);
    });
  }

  return fresh;
};

export const config: Config = {
  path: "/cache-api-example"
};
```
```typescript
import { fetchWithCache } from "@netlify/cache";
import type { Config, Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const request = new Request("https://example.com/expensive-api");

  // Get the response from the cache if it's there. If not, fetch it
  // and store it in the cache. This is a convenience method, equivalent
  // to the logic in the previous example.
  const response = await fetchWithCache(request);

  return response;
};

export const config: Config = {
  path: "/cache-api-example"
};
```
# Troubleshooting
# Outside handler scope
While you can open a cache instance anywhere in your function code, you can only read, write or delete entries from the cache within the scope of your request handler.
Attempting to perform any of those operations in another scope (such as the global scope) will throw an error.
```typescript
import type { Config, Context } from "@netlify/functions";

// ✅ This works.
const cache = await caches.open("my-cache");

// ❌ This will throw an error.
const cached = await cache.match("https://example.com");

export default async (req: Request, context: Context) => {
  // ✅ This works.
  const cached = await cache.match("https://example.com");

  if (cached) {
    return cached;
  }

  return new Response("Not in the cache", { status: 404 });
};

export const config: Config = {
  path: "/cache-api-example"
};
```
# Missing cache headers or directives
Responses must have a cache control header with caching directives.
If you are not in control of the server, or you don't want to change the response headers it returns, consider using the `fetchWithCache` utility method to modify the response headers before the response is added to the Cache API.
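Alternatively, when using the lower-level `cache.put()`, you can wrap the upstream response in a new `Response` with explicit directives before storing it. A minimal sketch of this header-rewriting pattern, where the upstream response and the TTL value are placeholders:

```typescript
// Sketch: give an upstream response an explicit s-maxage so the Cache API
// will accept it. In a real function, `upstream` would come from fetch().
const upstream = new Response("payload"); // stand-in for: await fetch(url)

const headers = new Headers(upstream.headers);
headers.set("Cache-Control", "public, s-maxage=300");

const cacheable = new Response(upstream.body, {
  status: upstream.status,
  headers,
});
// `cacheable` can now be stored, e.g. with cache.put(request, cacheable).
```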
# Unsupported Netlify-Vary directives
Responses must not use unsupported directives of the `Netlify-Vary` header.
Please refer to the cache key variation documentation for the full list of supported directives, and the values they accept.
# Unsupported Cache-Control directives
Responses must not set cache control headers with the `private`, `no-cache` or `no-store` directives, as these directives indicate that the response cannot be stored in a public cache without validating it with the origin server before each reuse.
Refer to the `Cache-Control` directives documentation for more information on the different cache control directives.
Consider removing these directives from your response. If you are not in control of the server, or you don't want to change the response headers it returns, consider using the `fetchWithCache` utility method to modify the response headers before the response is added to the Cache API.
# Invalid maximum age
Responses must have a cache control header with a `max-age` or `s-maxage` directive of at least 1 second.
Consider updating the cache control headers to include this directive with a supported value. If you are not in control of the server, or you don't want to change the response headers it returns, consider using the `fetchWithCache` utility method to modify the response headers before the response is added to the Cache API.
# Missing status code
Responses must specify a status code. Please ensure you're passing a valid `Response` object to the Cache API.
# Invalid status code
Responses must have a status code between 200 and 299.
Consider checking the status of the response before storing it with the Cache API.
```typescript
const request = new Request("https://example.com");
const response = await fetch(request);

if (response.ok) {
  await cache.put(request, response);
}
```
# Internal error
There was an internal error that prevented your Cache API operation from completing.
Please use our support page to report your problem.