I've written a simple for loop that makes POST requests with curl and saves each response to a .txt file.
for ((i=200000; i<=300000; i++)); do
    curl -s -X POST -d "do=something&page=$i" "https://example.com/ajax" -o "$i.txt"
done
Currently, the script creates a new output file roughly every 260 ms. Is it possible to make this process faster?
Upvotes: 1
Views: 1706
Reputation: 76
Have a look at GNU parallel. You can use it to parallelise almost anything, and it works well with curl. Replace for and while loops with it, and test to find the optimal job count: more is not always better, and returns diminish beyond a certain point.
Here is another post that discusses it: Bash sending multiple curl request using GNU parallel
I wanted to add a simple example to my previous post.
parallel -j8 curl -s '{}' < urls >/dev/null
-j8 means to run 8 jobs in parallel; if you leave it out, parallel defaults to one job per CPU core, and -j0 tells it to run as many as possible. 'urls' is a text file with one URL per line.
Adapt it as you see fit, since it doesn't conform exactly to your example above.
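As a rough sketch of how it might map onto your loop (untested; it assumes the same endpoint and POST data as in your question), you could feed the page numbers in with seq instead of a urls file:

seq 200000 300000 | parallel -j8 "curl -s -X POST -d 'do=something&page={}' 'https://example.com/ajax' -o '{}.txt'"

parallel substitutes each number for {} and saves each response to its own .txt file just like your loop does, but with 8 requests in flight at once.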
Upvotes: 1