user816604


curl for over 100K requests

I am using curl_multi_exec to process over 100K requests. I process them 100 at a time, since curl_multi_exec can only handle about 100 requests at once, looping until all 100K are done. We've added multiple servers to this system to spread the load (we are using load balancing). What is the best way to have curl handle 100K requests and make use of these additional servers? What's the downside (other than time) of handling that many requests on one server? How can I use the additional servers to help handle those requests?

To elaborate: basically, we are using curl to send out over 100K requests to third-party servers. The problem with using only one server is that there is a memory limit on the number of requests one server can handle. So we decided to add additional servers, but we are not sure how to design this system to use curl to handle that many requests.
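For reference, here is a rough sketch of the batching loop we use ($allUrls, the batch size of 100, and the curl options are placeholders, not our production code):

```php
<?php
// Fetch one batch of URLs concurrently with curl_multi.
function fetchBatch(array $urls): array
{
    $mh = curl_multi_init();
    $handles = [];

    foreach ($urls as $i => $url) {
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_TIMEOUT, 30);
        curl_multi_add_handle($mh, $ch);
        $handles[$i] = $ch;
    }

    // Drive all transfers in the batch until none are still active.
    do {
        $status = curl_multi_exec($mh, $active);
        if ($active) {
            curl_multi_select($mh); // wait for socket activity instead of busy-looping
        }
    } while ($active && $status === CURLM_OK);

    $results = [];
    foreach ($handles as $i => $ch) {
        $results[$i] = curl_multi_getcontent($ch);
        curl_multi_remove_handle($mh, $ch);
        curl_close($ch);
    }
    curl_multi_close($mh);

    return $results;
}

// Walk the full URL list 100 at a time ($allUrls is a stand-in
// for wherever the 100K URLs actually come from).
foreach (array_chunk($allUrls, 100) as $batch) {
    $results = fetchBatch($batch);
    // ... store or process $results here ...
}
```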

Thanks!

Upvotes: 0

Views: 505

Answers (1)

Marc B

Reputation: 360572

Don't obsess about CURL. It's simply a tool. You're focusing on the wrong level of design.

What you need to consider is how to spread this workload amongst multiple servers. A simple design would have one central database listing all your URLs, and a method for clients to "check out" a URL (or set of URLs) to chug away on.
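For example, a minimal sketch of that check-out step, assuming MySQL and a hypothetical urls table with id, url, and claimed_by columns (all names are illustrative):

```php
<?php
// Connect to the shared database that holds the master URL list.
$db = new PDO('mysql:host=dbhost;dbname=crawler', 'user', 'pass');

// A per-worker identity so each server claims its own rows.
$workerId = gethostname() . ':' . getmypid();

// Atomically claim up to 100 unclaimed URLs for this worker, so two
// servers never check out the same batch (MySQL allows LIMIT on UPDATE).
$db->exec('UPDATE urls SET claimed_by = ' . $db->quote($workerId) . '
           WHERE claimed_by IS NULL LIMIT 100');

// Read back the rows this worker just claimed.
$stmt = $db->prepare('SELECT id, url FROM urls WHERE claimed_by = ?');
$stmt->execute([$workerId]);
$batch = $stmt->fetchAll(PDO::FETCH_ASSOC);

// ... feed $batch into the curl_multi loop, then mark the rows done ...
```

Each server just runs this in a loop until no rows come back; adding capacity is then a matter of pointing another worker at the same database.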

Once you've got that working, the curl portion will be the easiest part of it all.

Upvotes: 2
