Reputation: 339
Is it possible to send an HTTP GET request to many hosts at the same time using Linux shell tools?
At the moment I do
wget -O- http://192.168.1.20/get_data-php > out.log
but I need to request this from all IPs in 192.168.1.0/17.
Upvotes: 2
Views: 436
Reputation: 33685
This builds on Drejc's answer, but avoids messing with temporary files and copes better when the process limit is lower than the number of hosts (e.g. if you have thousands of hosts).
#!/bin/sh
# Ping-scan the range (-sn) with aggressive timing (-T5), skip DNS lookups (-n),
# and write grepable output to stdout (-oG -); awk keeps only hosts reported "Up";
# parallel -j0 runs as many simultaneous wget jobs as the system allows.
nmap -T5 -n -sn 192.168.1.0/17 -oG - |
awk '/Up$/{print $2}' |
parallel -j0 wget -q -O- http://{}/get_data-php > allout.txt
Upvotes: 0
Reputation: 586
The simplest way to do this is to use bash brace expansion:
wget -O- http://192.168.{0..127}.{1..254}/get_data-php >>out.log
... if performance is not a concern (the requests run sequentially, one host at a time).
Of course there are ways to run the requests in parallel, but I guess that is out of scope for this question.
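For completeness, a minimal sketch of one such parallel approach using xargs -P. The host range mirrors the brace expansion above, the 32-process limit is an arbitrary assumption, and the actual wget line is left commented out so the sketch can be tried safely:

```shell
#!/bin/sh
# Sketch: fetch from every host in 192.168.0.0/17 with up to 32 parallel wgets.
# seq keeps this POSIX-sh compatible (brace expansion is a bash feature).
gen_hosts() {
  for third in $(seq 0 127); do
    for fourth in $(seq 1 254); do
      echo "192.168.${third}.${fourth}"
    done
  done
}

# Uncomment to actually run the requests:
# gen_hosts | xargs -P 32 -I{} wget -q -O- "http://{}/get_data-php" >> out.log
gen_hosts | wc -l   # number of hosts that would be queried
```

xargs -P caps the number of concurrent wget processes, so slow or unreachable hosts do not exhaust the system's process limit.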
Upvotes: 2
Reputation: 531
#!/bin/sh
rm -f address.txt allout.txt # remove old address list and output (-f: no error if missing)
nmap -n -sn 192.168.1.0/17 -oG - | awk '/Up$/{print $2}' > address.txt # get all active hosts and store them in address.txt
while IFS="" read -r add || [ -n "$add" ]
do
    wget -q -O- http://"$add"/get_data-php > out"$add".log & # for every address, fetch the content into its own file in the background
done < address.txt
wait # let all background wgets finish
cat out*.log > allout.txt # put all .log file contents into allout.txt
rm -f out*.log # remove all created .log files
Upvotes: 2