Reputation: 75127
I am running a bash script like this:
for i in {0..3250000..50000}
do
wget "http://xxx/select?q=*:*&row_size=50000&start=$i" -O $i.csv
done
Every time I send a request, I have to wait for it to finish and write to a file before the loop continues. However, I want to do it asynchronously: send a request and keep looping without waiting for the response, and when a response arrives, handle it appropriately.
How can I do that?
Upvotes: 3
Views: 254
Reputation: 879
You can use xargs:
printf '%s\0' {0..3250000..50000} |
xargs -0 -I {} -n 1 -P 20 \
wget 'http://xxx/select?q=*:*&row_size=50000&start={}' -O {}.csv
The -0 selects the NUL character as the delimiter, -I {} replaces {} with the argument, -n 1 hands a single argument to wget, and -P 20 processes 20 requests at a time, in parallel.
Alternatively, you can append a & to your command line to execute it in the background, and use wait to block until the processes finish.
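A minimal sketch of that alternative, applied to the loop from the question. Here fetch is a stand-in for the real wget call (so the sketch runs without network access); replace its body with wget "http://xxx/select?q=*:*&row_size=50000&start=$1" -O "$1.csv" in your script:

```shell
#!/bin/sh
# fetch: hypothetical stand-in for the wget invocation from the question.
fetch() {
  echo "start=$1" > "$1.out"   # simulated download result
}

for i in $(seq 0 50000 150000); do
  fetch "$i" &                 # launch in the background, do not wait here
done
wait                           # block until every background fetch has finished
```

Note that this launches one background process per iteration all at once, with no limit on concurrency; the xargs -P 20 variant above caps the number of parallel requests.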
Upvotes: 3
Reputation: 1621
You can execute it with the following; & runs the command in the background:
<cmd> &
But your loop looks huge, and every iteration will be running in the background while control returns to the script immediately. So, to avoid execution issues, you should write some check for whether the background operations have exited or not.
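One common way to do that check is to record each background job's PID via $! and then wait on each PID, which returns that job's exit status. A small self-contained sketch, where work is a hypothetical stand-in for the real command (the second job is made to fail deliberately):

```shell
#!/bin/sh
# work: hypothetical stand-in for the real background command.
work() {
  [ "$1" -ne 2 ]               # pretend that job 2 fails
}

pids=""
for n in 1 2 3; do
  work "$n" &
  pids="$pids $!"              # $! holds the PID of the last background job
done

failed=0
for pid in $pids; do
  if ! wait "$pid"; then       # wait <pid> returns that job's exit status
    failed=$((failed + 1))
  fi
done
echo "failed: $failed"
```

Counting failures this way lets the script decide whether to retry specific requests instead of just waiting blindly for everything to finish.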
Upvotes: 1