Reputation: 8593
I have a PHP script that queries a list of clients from a MySQL database, then visits each client's IP address to fetch some information, which is displayed on the webpage.
The problem is that this takes a long time when the number of clients is high. Is there any way I can send those URL requests (file_get_contents) in parallel?
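The blocking pattern described above looks roughly like this (the table, column names, and the /status path are illustrative assumptions, not the actual code):

```php
<?php
// Hypothetical sketch of the current serial approach.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$ips = $pdo->query('SELECT ip_address FROM clients')->fetchAll(PDO::FETCH_COLUMN);

$results = [];
foreach ($ips as $ip) {
    // Each request blocks until the previous one finishes,
    // so total time grows linearly with the number of clients.
    $results[$ip] = file_get_contents("http://$ip/status");
}
```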
Upvotes: 2
Views: 391
Reputation: 21563
If this needs to scale, I would use something like Gearman to assign the requests as jobs in a queue for workers to pick up and complete.
As another option, I have also written a PHP wrapper for the Unix at queue, which might be a fit for this problem. It would let you schedule the requests so that they run in parallel. I have used this method successfully in the past to send bulk email, which has blocking problems similar to your script's.
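A rough sketch of the Gearman approach, assuming the PECL gearman extension and a job server on localhost; the function name fetch_client and the /status URL are placeholders:

```php
<?php
// Producer: queue one background job per client IP.
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);

foreach ($ips as $ip) {
    $client->doBackground('fetch_client', json_encode(['ip' => $ip]));
}

// Worker (run several of these processes to fetch in parallel).
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('fetch_client', function (GearmanJob $job) {
    $data = json_decode($job->workload(), true);
    $body = file_get_contents("http://{$data['ip']}/status");
    // Store the result somewhere the webpage can read it, e.g. the database.
});
while ($worker->work());
```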
Upvotes: 1
Reputation:
Lineke Kerckhoffs-Willems wrote a good article about multithreading in PHP with cURL. You can use that approach instead of file_get_contents() to fetch the information you need.
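A minimal sketch along those lines using curl_multi to run the requests concurrently (the URL keys and the timeout value are assumptions):

```php
<?php
// Fetch several URLs in parallel with curl_multi instead of
// sequential file_get_contents() calls.
function fetch_parallel(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $key => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 10);
        curl_multi_add_handle($mh, $ch);
        $handles[$key] = $ch;
    }

    // Drive all transfers at once and wait until every handle has finished.
    do {
        $status = curl_multi_exec($mh, $running);
        if ($running) {
            curl_multi_select($mh);
        }
    } while ($running && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $key => $ch) {
        $results[$key] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}

// Example usage with the client IPs from the database:
// $responses = fetch_parallel(['10.0.0.1' => 'http://10.0.0.1/status', ...]);
```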
Upvotes: 2