John

Reputation: 81

PHP Laravel Uploading File to AWS S3 Asynchronously

I am implementing file upload using AWS S3. The files I want to upload average around 500 MB. The upload (using the Laravel filesystem with the AWS S3 driver) is synchronous, so when one user uploads a big file, other people cannot access the website until that upload finishes. How can I make it asynchronous?

Basically, I have two issues:

  1. Uploading large files in chunks so that other people can use the website
  2. Uploading it asynchronously.

The command I use to handle the upload is:

Storage::put('preview_image/'.$file_name, $file_preview_image_1, 'public');
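
For reference, a streamed variant of that call (assuming the uploaded file is an Illuminate\Http\UploadedFile; the field and variable names are just placeholders, not my exact code) passes a stream handle instead of the file contents, so the whole 500 MB is not held in memory at once:

use Illuminate\Support\Facades\Storage;

// Open a read stream to the temporary upload instead of loading it into a string.
$stream = fopen($request->file('preview_image')->getRealPath(), 'r');

// Flysystem copies the stream in chunks rather than buffering the whole file.
Storage::put('preview_image/'.$file_name, $stream, 'public');

if (is_resource($stream)) {
    fclose($stream);
}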

Upvotes: 3

Views: 3929

Answers (5)

John

Reputation: 81

SOLVED

After being stuck and searching for a good solution for several days, I finally got an explanation of why my project ran single-threaded: it was because I ran php artisan serve. I reported the issue at https://github.com/laravel/framework/issues/22944.

Upvotes: 1

Sapnesh Naik

Reputation: 11636

If you are trying to upload a file from the user's disk to a remote location, then it's not possible to do what you want (i.e. queuing it to be done later).

The user needs to complete the file upload in the POST request of the form - you can't queue it to be done later. Queueing is for delaying server-side processing tasks - but uploading needs the user to stay on the page and send the data to your server.

To expand further: the best option available to you is a JavaScript asynchronous upload, using a package like dropzonejs or something similar. This way users can upload multiple files simultaneously and get visual progress bars that update as they go.
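
On the server side, each of those asynchronous uploads still arrives as an ordinary request, so a plain controller method is enough to receive them. A minimal sketch - the controller, route, and field names are assumptions, not part of dropzonejs itself:

use Illuminate\Http\Request;
use Illuminate\Support\Facades\Storage;

class PreviewImageUploadController extends Controller
{
    // Each file sent by the JavaScript uploader arrives as its own POST request,
    // so one big file never blocks the rest of the page.
    public function store(Request $request)
    {
        $file = $request->file('file'); // "file" is dropzonejs's usual field name
        $path = Storage::disk('s3')->putFile('preview_image', $file, 'public');

        return response()->json(['path' => $path]);
    }
}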

Upvotes: 0

Mysteryos

Reputation: 5791

The rule of thumb when dealing with jobs that take an excessive amount of time (above 5 seconds) to complete is to process them in the background.

See: https://laravel.com/docs/5.5/queues
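
A minimal sketch of what such a background job could look like here: the controller stores the upload on the local disk first (fast), and the queued job moves it to S3 afterwards. The class, field, and path names below are assumptions, not from the question:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;

class UploadPreviewImageToS3 implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $localPath;

    public function __construct($localPath)
    {
        $this->localPath = $localPath;
    }

    public function handle()
    {
        // Assumes the default "local" disk rooted at storage/app.
        $stream = fopen(storage_path('app/'.$this->localPath), 'r');

        Storage::disk('s3')->put($this->localPath, $stream, 'public');

        if (is_resource($stream)) {
            fclose($stream);
        }

        // Clean up the temporary local copy once it is on S3.
        Storage::disk('local')->delete($this->localPath);
    }
}

// In the controller: save locally, queue the transfer, respond immediately.
$localPath = $request->file('preview_image')->store('preview_image');
UploadPreviewImageToS3::dispatch($localPath);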

So when one user uploads a big file, other people cannot access the website until the upload finishes

However, yours sounds like a hosting issue: either your upload bandwidth is being consumed completely by the upload, or the PHP script running the upload is consuming too much memory and therefore blocks other PHP processes from being spawned.

Upvotes: 0

Shobi

Reputation: 11451

I don't understand why other users can't access the website, because modern servers are capable of handling multiple concurrent requests very well; maybe you have to consider upgrading your server.

But to get asynchronous behaviour in your project, you can look into Laravel jobs and queues (here is the doc link). You will have to change your queue driver appropriately: by default Laravel's queue driver is sync, which is nothing but synchronous. Once you have your queues set up (they can be Redis queues, Amazon SQS, or anything else), you can push the file-uploading job to a queue and take the user out of the hassle of waiting for the file to upload. There are also custom packages like Laravel Horizon for monitoring your queues, where you can even restart a job if it fails.
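
For example, the driver switch is a small configuration change. A rough sketch, assuming Laravel 5.5 defaults and an available Redis server (UploadFileToS3 is a hypothetical job class):

// config/queue.php -- Laravel 5.5 reads the driver from QUEUE_DRIVER;
// the default "sync" driver runs every job inline, i.e. synchronously.
'default' => env('QUEUE_DRIVER', 'sync'),

// .env -- point it at a real backend such as Redis or Amazon SQS:
// QUEUE_DRIVER=redis

// Dispatch the upload work from a controller onto the queue:
dispatch(new UploadFileToS3($path));

// And keep a worker running to process the queued jobs:
// php artisan queue:work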

Upvotes: 0

yakobom

Reputation: 2711

Why don't you use asynchronous multipart uploads, which are recommended for files larger than 100 MB? The code will look something like this:

use Aws\S3\MultipartUploader;

// $s3Client is assumed to be a configured Aws\S3\S3Client instance.
$source = '/path/to/large/file.zip';
$uploader = new MultipartUploader($s3Client, $source, [
    'bucket' => 'your-bucket',
    'key'    => 'my-file.zip',
]);

// Returns a promise immediately instead of blocking until the upload is done.
$promise = $uploader->promise();

You can look at the documentation here: Asynchronous multipart uploads
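
A sketch of how that promise might then be consumed - the success/failure callbacks and the MultipartUploadException handling follow the SDK's promise interface, but the details here are an illustration, not code from the documentation:

use Aws\Exception\MultipartUploadException;

$promise->then(
    function ($result) {
        // Upload completed; $result['ObjectURL'] points at the new object.
    },
    function (MultipartUploadException $e) {
        // One or more parts failed; inspect $e or retry here.
    }
);

// Force the promise chain to resolve before the script ends.
$promise->wait();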

Upvotes: 1
