Reputation: 197
I have a file containing thousands of curl POST commands, and I need to run them in batches of 100.
The input file contains all the commands:
curl -v -XPOST -H "Content-type:application/json" -d '{"k1":"v1","k2":"v2"}' http://localhost:8080/v1/SomeEndPoint
curl -v -XPOST -H "Content-type:application/json" -d '{"k3":"v3","k4":"v4"}' http://localhost:8080/v1/SomeEndPoint
curl -v -XPOST -H "Content-type:application/json" -d '{"k5":"v5","k6":"v6"}' http://localhost:8080/v1/SomeEndPoint
I tried the command below. Even with the -n1 argument it doesn't treat each whole line as a single argument: it either splits the line on spaces, or (with -0) takes the whole file as one argument instead of a single line.
xargs -n1 -P100 -p bash -c < inputFile
Output
bash -c curl?...
bash -c -v?...
bash -c -XPOST?...
bash -c -H?...
bash -c Content-type:application/json?...
xargs -n1 -0 -P100 -p bash -c < inputFile
Output
bash -c curl -v -XPOST -H "Content-type:application/json" -d '{"k1":"v1","k2":"v2"}' http://localhost:8080/v1/SomeEndPoint
curl -v -XPOST -H "Content-type:application/json" -d '{"k1":"v1","k2":"v2"}' http://localhost:8080/v1/SomeEndPoint?...
How can I run it once for each line of the input file?
Upvotes: 1
Views: 547
Reputation: 8755
If your xargs supports the -d option (GNU xargs does), use it to split the input on newlines only:
xargs -d '\n' -n1 -P100 sh -c < input.txt
If it doesn't, translate newlines to NUL bytes and use -0:
tr '\n' '\0' < input.txt | xargs -0 -P100 -n1 sh -c
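A quick sanity check of the -d '\n' approach, using a hypothetical sample file where echo stands in for the real curl commands:

```shell
# hypothetical sample file of one-line commands (echo stands in for curl)
printf '%s\n' 'echo one two' 'echo three four' > sample.txt

# -d '\n' hands each full line to sh -c as a single argument
xargs -d '\n' -n1 -P4 sh -c < sample.txt
```

Each line prints intact ("one two", "three four"), though with -P4 the order of the lines is not guaranteed.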
To see exactly which arguments xargs hands to the command it runs, I would suggest using your favorite programming language to debug. Mine is Ruby, for example:
ruby -e 'p ARGV'
or NodeJS:
node -e 'console.log(process.argv)'
Put the above commands between xargs and the command you want to execute, for example:
xargs -n1 -P100 ruby -e 'p ARGV' bash -c < input.txt
["bash", "-c", "-v"]
["bash", "-c", "curl"]
["bash", "-c", "-XPOST"]
["bash", "-c", "-H"]
["bash", "-c", "https://httpbin.org/post"]
["bash", "-c", "-v"]
["bash", "-c", "-d"]
["bash", "-c", "Content-type:application/json"]
["bash", "-c", "curl"]
["bash", "-c", "{\"k1\":\"v1\",\"k2\":\"v2\"}"]
["bash", "-c", "{\"k3\":\"v3\",\"k4\":\"v4\"}"]
["bash", "-c", "-H"]
["bash", "-c", "Content-type:application/json"]
["bash", "-c", "-XPOST"]
["bash", "-c", "-d"]
["bash", "-c", "https://httpbin.org/post"]
["bash", "-c", "curl"]
["bash", "-c", "-v"]
["bash", "-c", "-XPOST"]
["bash", "-c", "-H"]
["bash", "-c", "Content-type:application/json"]
["bash", "-c", "-d"]
["bash", "-c", "{\"k5\":\"v5\",\"k6\":\"v6\"}"]
["bash", "-c", "https://httpbin.org/post"]
xargs -d '\n' -n1 -P100 ruby -e 'pp ARGV' sh -c < input.txt
["sh",
"-c",
"curl -v -XPOST -H \"Content-type:application/json\" -d '{\"k1\":\"v1\",\"k2\":\"v2\"}' http://localhost:8080/v1/SomeEndPoint"]
["sh",
"-c",
"curl -v -XPOST -H \"Content-type:application/json\" -d '{\"k3\":\"v3\",\"k4\":\"v4\"}' http://localhost:8080/v1/SomeEndPoint"]
["sh",
"-c",
"curl -v -XPOST -H \"Content-type:application/json\" -d '{\"k5\":\"v5\",\"k6\":\"v6\"}' http://localhost:8080/v1/SomeEndPoint"]
Upvotes: 3
Reputation: 2916
Certainly not the most elegant solution, but it could work:
input_file=test.sh
batch_size=100
nlines=$(wc -l < "$input_file")
for i in $(seq 1 $batch_size $nlines); do
for j in $(seq $i $((i+batch_size-1))); do
line="$(sed -n ${j}p $input_file)"
eval "$line" &
done
wait
done
It takes batches of n lines and executes them individually in the background with eval. After each batch, it waits for all the background processes to terminate (wait).
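A variant of the same idea: one sed pass per batch instead of one per line, with a non-overlapping range (i through i+batch_size-1). The sample file here is hypothetical; echo stands in for the real curl commands:

```shell
# hypothetical sample file; echo stands in for the real curl commands
printf 'echo line%d\n' 1 2 3 4 5 > cmds.txt

input_file=cmds.txt
batch_size=2
nlines=$(wc -l < "$input_file")

for i in $(seq 1 "$batch_size" "$nlines"); do
  # extract one whole batch with a single sed pass
  sed -n "${i},$((i + batch_size - 1))p" "$input_file" > batch.tmp
  while IFS= read -r line; do
    eval "$line" &          # run each command in the batch in the background
  done < batch.tmp
  wait                      # let the whole batch finish before the next one
done
rm -f batch.tmp
```

This avoids re-scanning the file once per line, which matters when the input has thousands of lines.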
Upvotes: 0
Reputation: 303
If each line is a fully constructed command, xargs is the wrong tool.
Both of these will work:
sed -n 1,100p inputFile | while IFS= read -r line; do eval "$line"; done
sed -n 1,100p inputFile | while IFS= read -r line; do echo "$line" | sh; done
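Since each line is already a complete command, a whole batch can also be fed to a single shell instead of spawning one per line. A sketch with a hypothetical sample file (echo stands in for curl):

```shell
# hypothetical sample file; echo stands in for the real curl commands
printf '%s\n' 'echo alpha' 'echo beta' 'echo gamma' > batch.txt

# run the first two lines of the file in one sh process
sed -n '1,2p' batch.txt | sh
```

This prints alpha and then beta; adjusting the sed range selects the next batch.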
Upvotes: 0