ClementDevos

Reputation: 63

Deploy Firebase Functions on Node 14 runtime with increased memory

We're currently using Firebase Functions with the node 14 public preview.

We need to increase the memory of a function past 1 GB.

The Google Cloud Functions documentation specifies that max_old_space_size must be set for the newer runtimes, and shows:

gcloud functions deploy envVarMemory \
--runtime nodejs12 \
--set-env-vars NODE_OPTIONS="--max_old_space_size=8Gi" \
--memory 8Gi \
--trigger-http

However, the --set-env-vars option does not exist in firebase deploy.

Running firebase deploy --only functions:myFunction --set-env-vars NODE_OPTIONS="--max_old_space_size=4Gi" yields the error: unknown option '--set-env-vars'.

After deploying a heavy function, I get the expected heap out of memory error:

[1:0x29c51e07b7a0]   120101 ms: Mark-sweep (reduce) 1017.1 (1028.5) -> 1016.2 (1028.7) MB, 928.7 / 0.1 ms  (average mu = 0.207, current mu = 0.209) allocation failure scavenge might not succeed 
[1:0x29c51e07b7a0]   119169 ms: Scavenge (reduce) 1016.9 (1025.2) -> 1016.2 (1026.5) MB, 3.6 / 0.0 ms  (average mu = 0.205, current mu = 0.191) allocation failure  

And from the log we can see the function only has about 1028 MB of RAM, not 4 GB.

We did ask it to deploy with 4 GB:

functions
    .runWith({ memory: '4GB', timeoutSeconds: 300 })
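
For context, the full declaration looks roughly like this (a minimal sketch; the HTTPS trigger and the export name myFunction are illustrative assumptions, not our actual code):

const functions = require('firebase-functions');

// Sketch only: the trigger type and export name are illustrative.
exports.myFunction = functions
    .runWith({ memory: '4GB', timeoutSeconds: 300 })
    .https.onRequest(async (req, res) => {
        // memory-heavy work goes here
        res.sendStatus(200);
    });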

What is the key here?

Upvotes: 6

Views: 1993

Answers (1)

Vincent Cotro

Reputation: 314

We had exactly the same issue. It seems to happen when deploying functions with Node 12 or later.

Here is how to solve it:

  1. Find your function in the GCP console
  2. Click on "Edit"
  3. Scroll down and find "Runtime environment variables"
  4. Add the key NODE_OPTIONS with the value --max_old_space_size=4096

Here is a picture of the setting:

[Screenshot of the "Runtime environment variables" section in the Cloud Functions console]

This is really annoying, and I did not find any way to set this from the command line during a firebase deploy.
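
For what it's worth, the same variable can probably also be set with the gcloud CLI after the Firebase deploy. This is an untested sketch: it assumes a 1st-gen function named myFunction in us-central1, and that gcloud reuses the source previously uploaded by Firebase when no --source is given.

# Untested sketch: patches NODE_OPTIONS on an already-deployed function.
# myFunction and us-central1 are assumptions; adjust to your project.
gcloud functions deploy myFunction \
    --region us-central1 \
    --update-env-vars NODE_OPTIONS="--max_old_space_size=4096"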

Upvotes: 12
