Reputation: 677
Ok, let's say my website allows users to upload large batches of files for processing.
Now, once they upload the files, a process_files() function is called. This takes a long time to complete, and it basically makes the website unusable for the user until the task either times out or completes.
What I'm wondering is whether there's an easy way to execute this in parallel or in the background so the user can still use the website?
If it makes a difference, in my case this will very seldom be called.
Upvotes: 2
Views: 335
Reputation:
I would suggest that after the file upload, you call a script via the CLI, which will run in the background and be non-blocking:
exec("php the_long_file_processing_script.php > /dev/null &");
Upvotes: 0
Reputation: 2560
You can divide the task into two parts:
The first part (web) only saves the files to disk, so the web request finishes quickly.
The second part (system) processes all files saved by the first part. You can start it periodically, e.g. as a cron task on Linux, or use some other task scheduler depending on your system.
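A minimal sketch of the "system" half under assumed directory names and a move-when-done convention; none of these paths come from the answer itself:

<pre><code>&lt;?php
// process_pending.php - hypothetical worker run periodically, e.g. from cron:
//   */5 * * * * php /path/to/process_pending.php

$pendingDir = '/var/uploads/pending';
$doneDir    = '/var/uploads/done';

foreach (glob($pendingDir . '/*') as $file) {
    // ... do the actual long-running processing on $file here ...

    // Move the file out of the pending directory so the next cron run
    // does not pick it up again.
    rename($file, $doneDir . '/' . basename($file));
}
</code></pre>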
Upvotes: 4