Kirill

Reputation: 3737

Cloud Functions for Firebase killed due to memory limit exceeded

I keep getting a sporadic error from Cloud Functions for Firebase when converting a relatively small image (2MB). When successful, the function takes about 2,000ms or less to finish, and according to the ImageMagick documentation I should not see any problems.

I tried increasing the buffer size for the command, which isn't allowed from within Firebase, and I tried to find alternatives to .spawn() as that could be overloaded with garbage and slow things down. Nothing works.

Upvotes: 62

Views: 47271

Answers (10)

Michael Giovanni Pumo

Reputation: 14774

You can set this from within your Cloud Function file on Firebase.

const functions = require('firebase-functions');

const runtimeOpts = {
  timeoutSeconds: 300,
  memory: '1GB'
}

exports.myStorageFunction = functions
  .runWith(runtimeOpts)
  .storage
  .object()
  .onFinalize((object) => {
    // do some complicated things that take a lot of memory and time
  });

Taken from the docs here: https://firebase.google.com/docs/functions/manage-functions#set_timeout_and_memory_allocation

Don't forget to then run firebase deploy from your terminal.
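If your project also contains hosting or other resources, the Firebase CLI's --only flag can scope the deploy to just your functions:

firebase deploy --only functions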

Upvotes: 74

GorvGoyl

Reputation: 49150

Figuring this out from the UI is a bit tricky, so here are some guided screenshots:
Go to https://console.cloud.google.com/functions/list

[screenshots: locating the function and editing its memory allocation in the Cloud Console]

You can also increase the default timeout of 60 seconds:

[screenshots: editing the function's timeout]

Upvotes: 9

Siddhant

Reputation: 825

You can add the configuration to your Firebase function definition, something like:

functions.runWith({ memory: '2GB', timeoutSeconds: 360 })
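
For example, a minimal sketch chaining it onto an HTTPS function (the heavyTask name is just a placeholder):

const functions = require('firebase-functions');

exports.heavyTask = functions
  .runWith({ memory: '2GB', timeoutSeconds: 360 })
  .https.onRequest((req, res) => {
    // work that needs more than the default 256MB / 60s
    res.send('done');
  });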

Upvotes: 5

ravo10

Reputation: 963

You can adjust your memory here:

[screenshot: the memory allocation setting in the Cloud Console]

Upvotes: 11

p3sn

Reputation: 1072

I was lost in the UI and couldn't find any option to change the memory, but finally found it:

  1. Go to the Google Cloud Platform Console (not the Firebase console).
  2. Select Cloud Functions in the menu.
  3. You should now see your Firebase function listed here; if not, check that you selected the right project.
  4. Ignore all the checkboxes, buttons and menu items; just click on the name of the function.
  5. Click Edit (top menu), change only the allocated memory, and click Save.

Upvotes: 68

Fan Kam Thong

Reputation: 171

The latest firebase deploy command overwrites the memory allocation back to the default 256MB and the timeout back to 60s.

Alternatively, to specify the desired memory allocation and maximum timeout, I use a gcloud command such as:

gcloud beta functions deploy YourFunctionName --memory=2048MB --timeout=540s
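
The same flags are also available on the non-beta command, assuming a reasonably recent gcloud release:

gcloud functions deploy YourFunctionName --memory=2048MB --timeout=540s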

For other options, please refer to:

https://cloud.google.com/sdk/gcloud/reference/beta/functions/deploy

Upvotes: 13

Kiana

Reputation: 1506

Another option here would be to avoid using .spawn() altogether.

There is a great image processing package for Node called sharp that uses the low-memory-footprint library libvips. You can check out the Cloud Function sample on GitHub.

Alternatively, there is a Node wrapper for ImageMagick (and GraphicsMagick) called gm. It even supports the -limit option to report your resource limitations to IM.
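
As a rough illustration, here is a minimal sketch of a thumbnail function built on sharp's stream interface. This is not the official sample; the thumbs/ prefix, the generateThumbnail name, and the 200x200 size are placeholder choices.

const functions = require('firebase-functions');
const { Storage } = require('@google-cloud/storage');
const path = require('path');
const sharp = require('sharp');

const storage = new Storage();

exports.generateThumbnail = functions.storage.object().onFinalize((object) => {
  // Skip non-images and our own output, otherwise the function
  // would re-trigger on the thumbnails it writes.
  if (!object.contentType || !object.contentType.startsWith('image/')) return null;
  if (object.name.startsWith('thumbs/')) return null;

  const bucket = storage.bucket(object.bucket);
  const thumbName = 'thumbs/' + path.basename(object.name);

  // sharp acts as a transform stream here; libvips keeps the memory
  // footprint far below that of a spawned ImageMagick convert process.
  const transformer = sharp().resize(200, 200, { fit: 'inside' });

  return new Promise((resolve, reject) => {
    const read = bucket.file(object.name).createReadStream();
    const write = bucket.file(thumbName).createWriteStream({
      metadata: { contentType: object.contentType },
    });
    read.on('error', reject);
    transformer.on('error', reject);
    write.on('error', reject).on('finish', resolve);
    read.pipe(transformer).pipe(write);
  });
});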

Upvotes: 0

Shai Ben-Tovim

Reputation: 932

It seems the default ImageMagick resource config in Firebase Cloud Functions doesn't match the actual memory allocated to the function.

Running identify -list resource in the context of a Firebase Cloud Function yields:

File       Area         Memory        Map       Disk   Thread  Throttle       Time
--------------------------------------------------------------------------------
 18750    4.295GB       2GiB       4GiB  unlimited        8         0   unlimited  

The default memory allocated to an FCF is 256MB; the default ImageMagick instance thinks it has 2GB, so it doesn't allocate buffers from disk and can easily try to over-allocate memory, causing the function to fail on Error: memory limit exceeded. Function killed.

One way is to increase the required memory as suggested above, although there's still a risk that IM will try to over-allocate, depending on your use case and outliers.

Safer yet would be to set the correct memory limit for IM as part of the image manipulation process using -limit memory [your limit]. You can figure out your approximate memory usage by running your IM logic with -debug Cache; it will show you all the buffers allocated, their sizes, and whether they were in memory or on disk.

If IM hits the memory limit, it will start allocating buffers on disk (memory-mapped first, then regular disk buffers). You'll have to consider your specific balance between I/O performance and memory cost. The price of every additional byte of memory you allocate to your FCF is multiplied by each 100ms of usage, so that can grow quickly.
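
A rough sketch of that approach, assuming the stock convert binary bundled with the Cloud Functions runtime; the 128MiB/256MiB caps and the resize geometry are placeholder values you would tune using the -debug Cache output described above.

const { spawn } = require('child_process');

function convertWithLimits(srcPath, destPath) {
  return new Promise((resolve, reject) => {
    // The -limit settings precede the input file so they apply to the
    // whole run: cap the in-memory pixel cache first, then the
    // memory-mapped cache, before IM falls back to plain disk buffers.
    const proc = spawn('convert', [
      '-limit', 'memory', '128MiB',
      '-limit', 'map', '256MiB',
      srcPath,
      '-resize', '1024x1024>',
      destPath,
    ]);
    proc.on('error', reject);
    proc.on('close', (code) =>
      code === 0 ? resolve() : reject(new Error(`convert exited with code ${code}`)));
  });
}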

Upvotes: 1

Kirill

Reputation: 3737

[update] As one commenter suggested, this should no longer be an issue, as Firebase functions now keep their settings on re-deploy. Thanks, Firebase!

It turns out, and this is not obvious or documented, that you can increase the memory allocation for your functions in the Google Cloud Functions Console. You can also increase the timeout for long-running functions. This solved the memory overload problem, and everything is working great now.

Edit: Note that Firebase will reset your values to the defaults on deploy, so you should remember to log in to the console and update them right away. I am still looking for a way to update these settings via the CLI; I will update this answer when I find it.

Upvotes: 23

ovaris

Reputation: 301

Update: It looks like they now preserve settings on re-deploy, so you can safely change the memory allocation in the Cloud Console!

Upvotes: 5
