Jasmine

Reputation: 476

Memory issues in a Cloud Storage Function

I have deployed a storage-triggered Cloud Function that needs more memory. I deployed the GCF as follows, with what I believe are the appropriate flags.

gcloud functions deploy GCF_name --runtime python37 --trigger-resource bucket_name --trigger-event google.storage.object.finalize --timeout 540s --memory 8192MB

But I observed in the Google Cloud console that the memory utilization graph never goes beyond 2GB. In the logs I am getting this error: Function execution took 34566 ms, finished with status: 'connection error', which I believe happens because of the memory shortage. Can I get some help with this?

[Memory utilization graph]

Edit:

The application uploads text files, each containing a certain number of samples, to the storage bucket. Each file is read when it is uploaded and its data is appended to a pre-existing file. The total number of samples will be at most 75600002; that's why I need 8GB of memory. It gives the connection error while appending the data to the file.

def write_to_file(filename, data, write_meta=False, metadata=[]):
    # Append the new samples to the accumulated file kept in /tmp.
    with open('/tmp/' + filename, "a+") as file1:
        if write_meta:
            # Write the metadata header line first.
            file1.write(":".join(metadata))
            file1.write('\n')
        # 'data' is a numpy array; convert its values to strings before joining.
        file1.write(",".join(data.astype(str)))

The memory utilisation graph looked the same after every upload.

Upvotes: 0

Views: 277

Answers (1)

MBHA Phoenix

Reputation: 2227

You are writing a file to /tmp, which is an in-memory filesystem. So start by deleting that file once you have finished uploading it. As the documentation notes:

Files that you write consume memory available to your function, and sometimes persist between invocations. Failing to explicitly delete these files may eventually lead to an out-of-memory error and a subsequent cold start.

Ref: https://cloud.google.com/functions/docs/bestpractices/tips#always_delete_temporary_files
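
A minimal sketch of that cleanup, assuming the google-cloud-storage client library and a hypothetical helper that wraps the question's write_to_file, uploads the merged file, and then frees /tmp:

import os
from google.cloud import storage  # assumes the google-cloud-storage client library


def append_upload_and_cleanup(bucket_name, filename, data):
    # Hypothetical helper: append as in the question, then persist and clean up.
    tmp_path = '/tmp/' + filename

    write_to_file(filename, data)  # the question's append logic

    # Persist the merged file back to Cloud Storage.
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.blob(filename).upload_from_filename(tmp_path)

    # Delete the temporary copy so it no longer counts against the function's
    # memory and does not persist into later (warm) invocations.
    if os.path.exists(tmp_path):
        os.remove(tmp_path)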

Upvotes: 2
