Reputation: 3655
I am writing a PHP script which downloads pictures from the Internet. Since the amount of data is huge, the script takes 10-15 minutes to run. Are there any better ways to handle such a situation, or should I simply execute the script and let it take the time it takes?
Upvotes: 1
Views: 86
Reputation: 1294
If the optimisation is worth the time investment, and if a substantial part of the execution time is spent on image processing, then calling a shell script that spins up a few parallel processes might be an option.
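For instance, a minimal sketch (not the poster's script) that hands the parallelism to the shell; it assumes a $urls array of collected image URLs and that curl and xargs are installed on the host:

```php
<?php
// A minimal sketch: write the URLs to a file and let the shell fan out.
// "-P 4" tells xargs to run up to four curl processes at once; the file
// name urls.txt is an illustrative assumption.
file_put_contents('urls.txt', implode("\n", $urls));
exec('xargs -P 4 -n 1 curl -s -O < urls.txt 2>&1', $output, $status);
if ($status !== 0) {
    error_log("Parallel download failed:\n" . implode("\n", $output));
}
```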
Upvotes: 0
Reputation: 1295
I would have recommended multiple threads to do it faster, if there are no bandwidth restrictions, but the closest thing PHP has is process control.
Alternatively: some time ago I wrote a similar scraper, and to make it run faster I used the exec functions to launch multiple instances of the same file. That approach also requires a shared repository and a locking mechanism. It sounds and looks dirty, but it works!
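A rough sketch of that pattern (not the original scraper, and meant to be run from the CLI); the file names urls.txt (the shared repository, one URL per line) and urls.lock are illustrative:

```php
<?php
// Parent mode: spawn four detached copies of this same script as workers.
if (!isset($argv[1])) {
    for ($i = 0; $i < 4; $i++) {
        // Redirecting output and appending '&' detaches each worker,
        // so exec() returns immediately.
        exec('php ' . escapeshellarg(__FILE__) . ' worker > /dev/null 2>&1 &');
    }
    exit;
}

// Worker mode: claim one URL at a time from the repository under a lock.
while (true) {
    $lock = fopen('urls.lock', 'c');
    flock($lock, LOCK_EX);                 // serialize access to urls.txt
    $urls = array_filter(array_map('trim', file('urls.txt')));
    $url  = array_shift($urls);
    file_put_contents('urls.txt', implode("\n", $urls));
    flock($lock, LOCK_UN);
    fclose($lock);

    if ($url === null) {
        break;                             // repository is empty, worker exits
    }
    file_put_contents(basename($url), file_get_contents($url));
}
```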
Upvotes: 1
Reputation: 57388
Your script appears to be essentially I/O bound. Short of getting more bandwidth, there's little you can do.
You can improve the user experience (if any) by increasing interactivity. For example, you can save the filenames you intend to download in a session, and redisplay the page (refreshing it, or going AJAX) after each one, showing the expected completion time, current speed, and percentage of completion.
Basically, the script saves the array of URLs in the session and, at each iteration, pops some of them and downloads them, perhaps checking how long each takes (if one file downloads in half a second, it is worth downloading another).
Since the script is executed several times rather than just once, you need not worry about its timeout. You do, however, have to deal with the possibility of the user aborting the whole process.
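A minimal sketch of that idea, assuming a first request has already filled $_SESSION['queue'] with the URLs and $_SESSION['total'] with the original count; the two-second budget per request and the meta refresh are illustrative choices:

```php
<?php
session_start();

// Download URLs from the session queue until the time budget is spent.
$start = microtime(true);
while (!empty($_SESSION['queue']) && microtime(true) - $start < 2.0) {
    $url = array_pop($_SESSION['queue']);
    file_put_contents(basename($url), file_get_contents($url));
}

// Show progress to the user.
$total = $_SESSION['total'];
$done  = $total - count($_SESSION['queue']);
printf('Downloaded %d of %d files (%.0f%%)', $done, $total, 100 * $done / $total);

if (!empty($_SESSION['queue'])) {
    // Reload the page for the next batch; if the user navigates away,
    // the refresh chain simply stops and the remaining queue stays in
    // the session.
    echo '<meta http-equiv="refresh" content="0">';
}
```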
Upvotes: 1