Arun

Reputation: 1220

Bash: sending multiple curl requests using GNU parallel

I have a list of URLs (5000+) and I need to send requests for them to an internal service, 25 in parallel at a time. I know how to send a single request:

curl -s http://192.168.150.113:9999/app.boxx.com 

And I tried using GNU parallel,

while true;do parallel -j25 curl -s http://192.168.150.101:9999/'{}' < list;done

Is it good to use GNU parallel? It works, but it feels quite slow, as if the throughput were no better than issuing a single API request at a time.

Instead, can we put an ampersand (&) at the end of each curl command and send the requests in parallel?
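
For reference, the ampersand approach would look roughly like the sketch below. Note that it backgrounds every request at once, with no cap of 25 concurrent jobs like parallel -j25 provides:

while read -r path; do
    curl -s "http://192.168.150.101:9999/$path" &    # background each request immediately
done < list
wait    # block until every background request has finished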

Upvotes: 2

Views: 3444

Answers (2)

Ole Tange

Reputation: 33685

Inian's answer is perfectly valid and is the preferred approach if you need to do something more complex. But if you are only going to run a single curl per line, you can do:

parallel -j25 curl -s http://192.168.150.101:9999/{} < list
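
If you also want to capture each response body, a minimal variant (assuming an out/ directory already exists; {#} is GNU parallel's job-sequence replacement string) could be:

parallel -j25 'curl -s http://192.168.150.101:9999/{} > out/{#}.body' < list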

Upvotes: 4

Inian

Reputation: 85580

I'm not sure you are using GNU parallel to its full potential. For it to work well, you need to define a small job (the smallest unit you can break the work down into) and let parallel run it as many times as needed.

Assuming the part http://192.168.150.113:9999/ is a fixed string and the rest of each URL comes from a file, define a function that fetches one URL:

oneShot() {
    url="http://192.168.150.113:9999/"
    finalURL="$url$1"    # append the suffix passed as the first argument
    curl -s "$finalURL"
}

and export this function so it is available in the child shells that GNU parallel spawns:

export -f oneShot

and now do the magic to achieve parallelism, running 25 jobs at a time:

parallel -j25 oneShot < list
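
As a possible extension beyond the original answer, GNU parallel's --joblog option records each job's runtime and exit code, which makes it easy to verify that the requests really ran concurrently:

parallel -j25 --joblog run.log oneShot < list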

Upvotes: 3
