Reputation: 43
I'm implementing a serverless project on Google Cloud. Users will upload 4GB zip files to a Cloud Storage bucket (users compress the files themselves before uploading), and the contents need to be uncompressed before they can be processed.
I found some solutions for small files. In those, the file downloaded by the function is stored in the memory allocated to the function. However, the maximum memory for Cloud Functions is 2GB, which is too small for my files.
In the worst case I would need to use VMs, but that would be expensive.
Are there any other ways around this? My preferred language is Python.
Upvotes: 1
Views: 2110
Reputation: 317740
A solution for node would look something like this:
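Here is a minimal sketch, assuming the `unzipper` npm package, the official `@google-cloud/storage` client, and a Cloud Storage trigger on the upload bucket; the `output-bucket` destination and the `unzip` function name are placeholders:

```javascript
const { Storage } = require('@google-cloud/storage');
const unzipper = require('unzipper');

const storage = new Storage();

// Background function triggered when a zip file is finalized in the
// upload bucket. The (file, context) signature assumes the Node 10+
// Cloud Functions runtime.
exports.unzip = (file, context) => {
  const zipFile = storage.bucket(file.bucket).file(file.name);

  return new Promise((resolve, reject) => {
    zipFile.createReadStream()
      // unzipper.Parse() emits one 'entry' per archive member as the
      // bytes arrive, so the whole zip is never held in memory.
      .pipe(unzipper.Parse())
      .on('entry', (entry) => {
        // Stream each entry straight into a destination object;
        // 'output-bucket' is a placeholder for wherever the
        // uncompressed contents should land.
        const dest = storage.bucket('output-bucket').file(entry.path);
        entry.pipe(dest.createWriteStream());
      })
      // For simplicity this resolves when parsing finishes; a
      // production version would also wait on each upload stream.
      .on('finish', resolve)
      .on('error', reject);
  });
};
```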
You will likely need to understand node streams well in order to make this happen.
Since this all happens by piping streams (rather than reading everything into memory at once), it should run with minimal memory.
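One caveat, assuming the `unzipper` package used in the sketch above: any entry you decide not to pipe anywhere should be discarded with `entry.autodrain()`, otherwise the parser will stall waiting for that entry to be consumed.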
Upvotes: 7