Reputation: 63
I have a DICOM viewer application that allows users to upload DICOM studies (500 MB to 3 GB in size). Each study can contain 200–2,000 individual DICOM files. I allow users to upload these studies directly to a Google Cloud Storage bucket that is publicly writable. After a study is fully uploaded to the bucket, the frontend application sends a request to a Cloud Function to process the uploaded files.
There are 4 parts to processing the files:
The issue I am having is that running all of these processing steps after the full study is uploaded takes too long. One solution I was considering is to invoke a Cloud Function for each individual DICOM file as it gets uploaded to the bucket, run #1 and #2 on it, and then wait for the study to fully upload before running #3 and #4 (a rough sketch of such a trigger is below). My concern with this approach is that since my bucket is publicly writable, a malicious user could upload a very large number of files and trigger many Cloud Function invocations, resulting in unnecessary charges.
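For reference, this is roughly what I imagine the per-file trigger would look like. This is a minimal sketch only, assuming a 2nd-gen Cloud Function with an Eventarc `google.cloud.storage.object.v1.finalized` trigger; `run_step_1` and `run_step_2` are hypothetical placeholders for whatever #1 and #2 actually do:

```python
import functions_framework

# Hypothetical placeholders for the actual per-file processing (#1 and #2).
def run_step_1(bucket: str, name: str) -> None: ...
def run_step_2(bucket: str, name: str) -> None: ...

# Fires once per object as it is finalized in the bucket
# (trigger type: google.cloud.storage.object.v1.finalized).
@functions_framework.cloud_event
def process_dicom_file(cloud_event):
    data = cloud_event.data
    bucket = data["bucket"]
    name = data["name"]

    # Only handle DICOM files; skip anything else dropped in the bucket.
    if not name.endswith(".dcm"):
        return

    run_step_1(bucket, name)
    run_step_2(bucket, name)
```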
Another approach is to only allow authenticated users to upload files to a private GCS bucket, but that would require generating a signed URL for each DICOM file. So for a study with 2,000 files, the front-end app would need to request 2,000 signed URLs from the backend (see the sketch below).
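To illustrate what that backend endpoint would do: generating a V4 signed URL is a local signing operation against the service account credentials, not a call to GCS, so one request can return URLs for every file in the study. A minimal sketch, assuming a Python backend with the `google-cloud-storage` client and a hypothetical bucket name `dicom-uploads`:

```python
from datetime import timedelta

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("dicom-uploads")  # hypothetical bucket name

def create_upload_urls(study_id: str, filenames: list[str]) -> dict[str, str]:
    """Return one V4 signed PUT URL per DICOM file in the study.

    Signing happens locally with the service account credentials,
    so a single backend call can produce all 2,000 URLs in one batch.
    """
    urls = {}
    for filename in filenames:
        blob = bucket.blob(f"studies/{study_id}/{filename}")
        urls[filename] = blob.generate_signed_url(
            version="v4",
            expiration=timedelta(minutes=30),
            method="PUT",
            content_type="application/dicom",
        )
    return urls
```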
I am not sure how to approach this issue. Any advice on design or implementation would be helpful.
Upvotes: 0
Views: 339