
Get started with functions

This page will help you get started with Functions. It describes how to write your functions and route requests to them.


# Prepare project

Start by adding the @netlify/functions module to your project, which exports all the types you need to create type-safe functions.

npm install @netlify/functions

You don’t need any additional tooling or configuration to use TypeScript functions, but you can provide your own tsconfig.json file if you want to extend the base configuration, for example to rewrite import paths.

Our build system will load any tsconfig.json files from your functions directory, the repository root directory, or the base directory, if set.
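As an illustration, here is a minimal tsconfig.json sketch that adds a path alias; the @lib alias and directory layout are examples, not requirements of the build system:

{
  "compilerOptions": {
    "baseUrl": ".",
    "paths": {
      // Map the "@lib/*" alias to a local source directory (illustrative)
      "@lib/*": ["./src/lib/*"]
    }
  }
}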

# Create function file

To add a serverless function to your project, create a TypeScript file in your functions directory.

You can store your function file directly under the functions directory or in a subdirectory dedicated to the function. If you choose a subdirectory, the function entry file must be named index or have the same name as the subdirectory.

For example, any of the following files would create a function called hello:

  • netlify/functions/hello.mts
  • netlify/functions/hello/hello.mts
  • netlify/functions/hello/index.mts

Using ES modules

Naming your function with the .mts extension lets you use the modern ES modules syntax. To learn more about the different module formats, refer to runtime.

# Write a function

A function file must use the JavaScript modules syntax and export a handler function as its default export.

The handler function receives the following arguments:

  • A Request object representing the incoming HTTP request
  • A Context object with additional metadata about the request, such as geolocation data and matched route parameters

For synchronous functions, the return value of the handler function may be used as the HTTP response delivered to the client.
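As a minimal sketch, a handler that uses both arguments (the geolocation field shown assumes the geo data Netlify exposes on the context object):

import type { Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  // Read a header from the Request and geolocation data from the Context
  const userAgent = req.headers.get("user-agent") ?? "unknown client";
  const city = context.geo?.city ?? "an unknown city";

  return new Response(`Hello, ${userAgent} visiting from ${city}!`);
};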

# Synchronous function

A synchronous function lets you implement a traditional client/server interaction, where the connection is kept open until the function execution is finished, allowing the client to wait for a response before rendering a page or moving on to the next task.

The handler function should return a Response object representing the HTTP response to be delivered to the client, including any caching headers you want to set. If no value is returned, the client will receive an empty response with a 204 status code.

import type { Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  return new Response("Hello, world!")
}
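For instance, a sketch of a synchronous function that sets a caching header on its response; the Cache-Control value here is only illustrative:

import type { Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  return new Response("Hello, world!", {
    headers: {
      // Cache the response on the client for 60 seconds (illustrative policy)
      "cache-control": "public, max-age=60"
    }
  });
};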

Synchronous functions can stream data to clients as it becomes available, rather than returning a buffered payload at the end of the computation. This lets developers and frameworks create faster experiences by using streaming and partial hydration to get content and interactions in front of people as quickly as possible.

To stream a function’s response, return a ReadableStream as the body property of the Response object.

Examples

The first example below streams HTML to the client as it is generated, the second proxies a streamed chat completion from the OpenAI API, and the third streams a chat completion using the Groq SDK.
export default async () => {
  const encoder = new TextEncoder();
  const formatter = new Intl.DateTimeFormat("en", { timeStyle: "medium" });
  const body = new ReadableStream({
    start(controller) {
      controller.enqueue(encoder.encode("<html><body><ol>"));
      let i = 0;
      const timer = setInterval(() => {
        controller.enqueue(
          encoder.encode(
            `<li>Hello at ${formatter.format(new Date())}</li>\n\n`
          )
        );
        if (i++ >= 5) {
          controller.enqueue(encoder.encode("</ol></body></html>"));
          controller.close();
          clearInterval(timer);
        }
      }, 1000);
    }
  });

  return new Response(body);
};

export default async (req: Request) => {
  // Get the prompt from the request query string, or use a default
  const pie =
    new URL(req.url).searchParams.get("pie") ??
    "something inspired by a springtime garden";

  // The response body returned from "fetch" is a "ReadableStream",
  // so you can return it directly in your streaming response
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Set this environment variable to your own key
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        {
          role: "system",
          content:
            "You are a baker. The user will ask you for a pie recipe. You will respond with the recipe. Use markdown to format your response"
        },
        // Use "slice" to limit the length of the input to 500 characters
        { role: "user", content: pie.slice(0, 500) }
      ],
      // Use server-sent events to stream the response
      stream: true
    })
  });

  return new Response(res.body, {
    headers: {
      // This is the mimetype for server-sent events
      "content-type": "text/event-stream"
    }
  });
};

import Groq from "groq-sdk";

const client = new Groq({
  apiKey: process.env.GROQ_API_KEY
});

export default async (req: Request) => {
  // Get the prompt from the request query string, or use a default
  const pie =
    new URL(req.url).searchParams.get("pie") ??
    "something inspired by a springtime garden";

  // Create a streaming chat completion
  const stream = await client.chat.completions.create({
    // The language model which will generate the completion.
    model: "llama3-8b-8192",
    messages: [
      {
        role: "system",
        content:
          "You are a baker. The user will ask you for a pie recipe. You will respond with the recipe. Use markdown to format your response"
      },
      // Use "slice" to limit the length of the input to 500 characters
      { role: "user", content: pie.slice(0, 500) }
    ],

    //
    // Optional parameters
    //

    // Controls randomness: lowering results in less random completions.
    // As the temperature approaches zero, the model will become deterministic
    // and repetitive.
    temperature: 0.5,

    // The maximum number of tokens to generate. Requests can use up to
    // 2048 tokens shared between prompt and completion.
    max_tokens: 1024,

    // Controls diversity via nucleus sampling: 0.5 means half of all
    // likelihood-weighted options are considered.
    top_p: 1,

    // A stop sequence is a predefined or user-specified text string that
    // signals an AI to stop generating content, ensuring its responses
    // remain focused and concise. Examples include punctuation marks and
    // markers like "[end]".
    stop: null,

    // If set, partial message deltas will be sent.
    stream: true
  });

  // Wrap the Stream<ChatCompletionChunk> in a ReadableStream
  const readableStream = new ReadableStream({
    async start(controller) {
      for await (const chunk of stream) {
        // Enqueue the chunk into the ReadableStream
        controller.enqueue(
          new TextEncoder().encode(chunk.choices[0]?.delta?.content || "")
        );
      }

      controller.close(); // Close the stream when it's done
    }
  });

  return new Response(readableStream, {
    headers: {
      // This is the mimetype for server-sent events
      "content-type": "text/event-stream"
    }
  });
};

When returning a stream, keep the following limitations in mind:

  • 10 second execution limit. If the limit is reached, the response stops streaming.
  • 20 MB response size limit. Responses larger than 20 MB cannot be streamed.

# Background function

This feature is in Beta and is available on Core Pro and Enterprise plans.

With background functions, the function invocation is placed into a queue and the client connection is terminated immediately. This pattern lets you perform longer-running operations without forcing clients to wait for a response.

The handler function does not need to return anything, as the client will always receive an empty response with a 202 status code. Any response returned by the handler function will be ignored.

import { Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  // someLongRunningTask is a placeholder for your own long-running work
  await someLongRunningTask();

  console.log("Done");
};

To define a background function, the name of the function needs to have a -background suffix (for example, netlify/functions/hello-background.mts or netlify/functions/hello-background/index.mts).

# Route requests

Netlify automatically creates a dedicated endpoint for every function you create, using the format https://<YOUR DOMAIN>/.netlify/functions/<FUNCTION NAME>.
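For example, a function named hello on a site served at example.netlify.app (an illustrative domain) could be invoked like this:

curl https://example.netlify.app/.netlify/functions/hello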

Additionally, you can configure the function to run on any path of your choice by defining a path property in the config export of your function.

import { Config, Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const { city, country } = context.params;

  return new Response(`You're visiting ${city} in ${country}!`);
};

export const config: Config = {
  path: "/travel-guide/:city/:country"
};

You can choose to run a function on one or more URL paths. To configure multiple paths, set the path property as an array.

import { Config } from "@netlify/functions";

export const config: Config = {
  path: ["/cats", "/dogs"]
};

You can leverage the URLPattern syntax from the web platform to define wildcards and named groups, which are matched against the incoming request URL and exposed to the function in the context.params object.

import { Config } from "@netlify/functions";

export const config: Config = {
  path: ["/sale/*", "/item/:sku"]
};

When needed, use the optional excludedPath property to exclude specific routes from those matched by path. Like path, each pattern must start with /, for example "/*.css". The property accepts a single string or an array of strings.

import { Config } from "@netlify/functions";

export const config: Config = {
  path: "/product/*",
  excludedPath: ["/product/*.css", "/product/*.js"]
}

By default, a function runs for any requests to its configured paths regardless of whether or not static assets exist on those paths. To prevent the function from shadowing files on the CDN, set preferStatic to true.

import { Config } from "@netlify/functions";

export const config: Config = {
  path: ["/product/:sku", "/item/:sku"],
  preferStatic: true
};

# Environment variables

Netlify Functions have access to environment variables in the runtime environment via the Netlify.env global object.

import { Context } from "@netlify/functions";

export default async (req: Request, context: Context) => {
  const requestKey = req.headers.get("X-API-Key");
  const apiKey = Netlify.env.get("MY_API_KEY");

  if (requestKey === apiKey) {
    return new Response("Welcome!");
  }

  return new Response("Sorry, no access for you.", { status: 401 });
};

If you have the option to set specific scopes for your environment variables, the scope must include Functions for the variables to be available to functions at runtime.

You can also leverage build environment variables to configure how Netlify builds your functions. For example, you can use an environment variable to set the Node.js version.
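For instance, assuming you configure builds with a netlify.toml file, a sketch that sets the Node.js version through the NODE_VERSION build environment variable:

[build.environment]
  # Pin the Node.js version used for builds (illustrative value)
  NODE_VERSION = "20"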

Learn more about how to set and use environment variables with functions.

# Runtime

Netlify Functions run in Node.js, using the version configured for your site. Node.js 18.0.0 is the minimum supported version because functions use the standard Fetch API, which was only added natively to Node.js in version 18.

# Module format

Node.js supports two distinct module formats with different capabilities and APIs: ECMAScript modules (or ES modules), an official standard format for JavaScript packages, and CommonJS, a legacy format specific to Node.js.

The module format for each function will be determined by the file extension of its entry file:

  • Functions with the .mts extension are always executed as ES modules
  • Functions with the .cts extension are always executed as CommonJS
  • Functions with the .ts extension are executed as ES modules if the closest package.json file has a type property with the value module; otherwise they are executed as CommonJS

Choosing a module format has implications on how you write your function, especially when it comes to importing npm packages (see the sketch after this list):

  • CommonJS functions cannot use a static import to load npm packages written as ES modules and must use a dynamic import
  • ES module functions cannot use named imports (for example, import { kebabCase } from "lodash") when referencing npm packages written in CommonJS, and should instead use a default import (for example, import _ from "lodash")
  • In ES modules, Node.js built-in primitives like __dirname and __filename are not available and should be replaced with import.meta.url
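A minimal sketch of these differences, using lodash as a stand-in for any CommonJS package (the file name is illustrative):

// netlify/functions/hello.mts — executed as an ES module
import _ from "lodash"; // default import; named imports may fail for CommonJS packages
import { fileURLToPath } from "node:url";
import path from "node:path";

// __dirname is not available in ES modules; derive the directory from import.meta.url instead
const functionDir = path.dirname(fileURLToPath(import.meta.url));

export default async () => {
  return new Response(`${_.kebabCase("Hello World")} served from ${functionDir}`);
};

A CommonJS function (a .cts file) would instead load any ES-modules-only dependency with a dynamic import inside the handler, for example const mod = await import("some-esm-only-package"), where the package name is hypothetical.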

Prefer ES modules

Unless you have strong reasons to prefer CommonJS, we recommend ES modules: it’s a modern, standard, forward-looking format, and because it’s also used in Edge Functions, choosing it keeps your code interoperable between the two function types.

# Lambda compatibility

Netlify Functions support an alternative API surface that is compatible with AWS Lambda. This may be useful if you’re looking to migrate Lambda workflows into Netlify with minimal refactoring required.

To opt in, export your handler function as a named export called handler.

import type { Handler } from "@netlify/functions";

export const handler: Handler = async (event, context) => {
  return {
    body: JSON.stringify({ message: "Hello World" }),
    statusCode: 200,
  }
}

For more information about this API, refer to Lambda compatibility.

# Test locally

To streamline writing and testing your functions on Netlify, run a local development environment with Netlify Dev. This feature of Netlify CLI includes tools for local function development through a simulated Netlify production environment. The netlify dev command starts a framework server if a framework is detected and handles redirects, proxy rules, environment variables, and Netlify Functions.

By default, the geo location used is the location of your local environment. To override this to a default mock location of San Francisco, CA, USA, use the --geo=mock flag. To mock a specific country, use --geo=mock --country= with a two-letter country code. For more information about the --geo flag, visit the CLI docs.
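For example (the country code is illustrative):

netlify dev --geo=mock
netlify dev --geo=mock --country=DE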

# Next steps

Push your function source files to your Git provider for continuous deployment where Netlify’s build system automatically detects, builds, and deploys your functions. For more control over the process, learn about other workflows for deploying your functions including custom builds with continuous deployment and manual deploys with the Netlify CLI or API.

Monitor function logs and metrics in the Netlify UI to observe and help troubleshoot your deployed functions.

You can also stream function logs to your console with the Netlify CLI.