Mikhail Ramendik

Reputation: 1115

Handling a big file upload by chunks in CGI or similar, ideally Python

So I want to make a minimalistic personal-use website that, among other things, allows someone (with credentials or a single-use URL) to upload a large file for me. Downloading this file is not to be implemented; I'll download it outside the web UI. I want to show the user a progress indicator (just text, not bothering with design).

Ideally I want to push the file straight into S3 storage (Backblaze B2). I did find one existing solution that can use S3 storage, Gokapi, but it only allows the admin to upload. (Not sure I see the point; the admin could just as well use the S3 command line or tools like S3 Browser.) There also seems to be a possibility to use S3 storage in Nextcloud, but I am not convinced I want to install a comprehensive "box of everything" like Nextcloud.

So it seems like I want to code something simple like a CGI script or similar. However, all the CGI examples for file upload that I can find are only called by the web server (Apache or nginx) after the file has already been uploaded in full. What if the file is bigger than the storage I have on the VPS? This approach also does not seem to offer any way to show progress to the uploader.
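
To make it concrete, what I'm picturing on the server side is something like the sketch below: a CGI script that reads the request body from stdin in fixed-size chunks instead of letting cgi.FieldStorage buffer the whole upload. This is untested and the chunk size is arbitrary; for a multipart/form-data POST the raw body would still contain the multipart boundaries, so that framing would need to be parsed out somehow.

    #!/usr/bin/env python3
    # Untested sketch: read the raw CGI request body from stdin in fixed-size
    # chunks instead of letting cgi.FieldStorage buffer the whole file.
    # Note: for multipart/form-data the raw body still contains the multipart
    # boundaries and part headers, so that framing would need parsing.
    import os
    import sys

    CHUNK_SIZE = 8 * 1024 * 1024  # 8 MiB, arbitrary

    def read_body_in_chunks():
        remaining = int(os.environ.get("CONTENT_LENGTH", "0"))
        stdin = sys.stdin.buffer
        while remaining > 0:
            chunk = stdin.read(min(CHUNK_SIZE, remaining))
            if not chunk:
                break
            remaining -= len(chunk)
            yield chunk

    # Each chunk would then go straight to S3 rather than onto the VPS disk.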

How can I go about handling an upload chunk by chunk, pushing it into S3 storage and displaying progress updates, in the most minimalistic way possible?
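
On the S3 side, I assume the mechanism would be a multipart upload via boto3, roughly along these lines (untested sketch; the endpoint, bucket, key and credential values are placeholders for my B2 settings):

    # Untested sketch of the S3 side: boto3 multipart upload against an
    # S3-compatible B2 endpoint. Endpoint, bucket, key and credentials are
    # placeholders.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.us-west-004.backblazeb2.com",  # example B2 endpoint
        aws_access_key_id="KEY_ID",
        aws_secret_access_key="APPLICATION_KEY",
    )

    def upload_chunks(chunks, bucket="my-bucket", key="incoming/bigfile"):
        # Start a multipart upload and push each incoming chunk as one part.
        mpu = s3.create_multipart_upload(Bucket=bucket, Key=key)
        parts = []
        try:
            for number, chunk in enumerate(chunks, start=1):
                # Every part except the last must be at least 5 MiB.
                resp = s3.upload_part(
                    Bucket=bucket, Key=key, PartNumber=number,
                    UploadId=mpu["UploadId"], Body=chunk,
                )
                parts.append({"ETag": resp["ETag"], "PartNumber": number})
                # Progress reporting would hook in here, per part uploaded.
            s3.complete_multipart_upload(
                Bucket=bucket, Key=key, UploadId=mpu["UploadId"],
                MultipartUpload={"Parts": parts},
            )
        except Exception:
            s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=mpu["UploadId"])
            raise

What I can't see is how to fit something like this into a CGI (or similar) request flow and still report progress back to the uploader.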

Upvotes: 0

Views: 31

Answers (0)
