Reputation: 99
I am building a service that lets people upload large amounts of high-quality video and photo files to Google Cloud Storage. I am using FFmpeg to convert the videos to smaller sizes, and I am running into a lot of trouble with my functions crashing.
Am I correct in understanding that Cloud Functions are built to handle light computational tasks? Could I get better performance by having a Cloud Function trigger a Compute Engine instance for the video processing?
What's the best way to structure a flow like this:
1: User uploads 1-50 video files.
2: Generate smaller copies of the video files for quick mobile and web browsing.
My current structure is:
1: Upload the file to a storage bucket.
2: A Cloud Function is triggered and performs the conversion.
3: The Cloud Function writes the thumbnail video to the storage bucket.
Can I get better results from doing something like:
1: Upload the file to a storage bucket.
2: A Cloud Function is triggered.
3: The Cloud Function triggers a Google Compute Engine instance.
4: The Compute Engine instance reduces the file sizes.
5: The Compute Engine instance writes the thumbnail video to Google Cloud Storage.
Any pointers or leads on how to do this would be great.
Upvotes: 0
Views: 206
Reputation: 317352
Cloud Functions instances don't have a very large amount of resources to work with, so they work well for smaller amounts of work. A function can't run for longer than 9 minutes; that's a hard limit no matter how much processing power it gets. For larger amounts of work that might run longer, you are almost certainly better off using Compute Engine, but you really need to benchmark your expected use case yourself to know for sure.
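One common way to wire this up is to keep the Cloud Function as a lightweight dispatcher: it reacts to the Storage upload event and hands the heavy transcoding off to a worker running on Compute Engine (for example via a Pub/Sub topic the worker pulls from). Here's a minimal sketch of that dispatcher, assuming a Pub/Sub handoff; the topic name, `make_job_message` helper, and the specific ffmpeg arguments are all illustrative, not a fixed API:

```python
# Sketch of the "Cloud Function as dispatcher" pattern.
# The function does NOT run ffmpeg itself (that risks hitting the
# 9-minute limit); it only describes the job and hands it off.
import json

# Hypothetical Pub/Sub topic that a Compute Engine worker subscribes to.
TRANSCODE_TOPIC = "projects/my-project/topics/transcode-jobs"


def make_job_message(bucket: str, name: str) -> bytes:
    """Build one transcode-job description for the worker to pick up."""
    job = {
        "source": f"gs://{bucket}/{name}",
        "target": f"gs://{bucket}/thumbs/{name}",
        # The worker would run something like:
        #   ffmpeg -i <source> -vf scale=-2:480 -c:v libx264 -crf 28 <target>
        # (example settings for a small browsing copy; tune to taste)
        "ffmpeg_args": ["-vf", "scale=-2:480", "-c:v", "libx264", "-crf", "28"],
    }
    return json.dumps(job).encode("utf-8")


def on_upload(event, context):
    """Entry point for a Storage 'finalize' trigger (illustrative).

    Instead of transcoding here, publish the job and return immediately,
    so the function finishes in seconds regardless of video size.
    """
    message = make_job_message(event["bucket"], event["name"])
    # In a real function you would publish via the google-cloud-pubsub
    # client here, e.g. publisher.publish(TRANSCODE_TOPIC, message).
    return message
```

The Compute Engine side is then just a loop that pulls a job message, downloads the source, runs ffmpeg, and uploads the result; it can take as long as it needs.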
Upvotes: 1