Reputation: 39
Because of the nature of the script (written at work, on a work RHEL machine) I cannot show the code, but I can at least provide pseudocode as a starting point. Currently:
start loop
1) read the next line of a host text file and assign it to a variable (the host name)
2) send an ssh -t command to that host (which takes anywhere between 2 and 6 minutes to return a response)
3) log the response to a text file (then repeat the loop with the next host from the file)
end loop
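In shell terms, the current script is roughly the following (a minimal sketch; hosts.txt, somecommand, and results.txt stand in for the real names):
#!/bin/sh
# Serial version: each host must answer before the next one is contacted,
# so the total runtime is the sum of all the 2-6 minute responses.
while read -r host
do
    # </dev/null keeps ssh from swallowing the rest of hosts.txt
    ssh -t "$host" somecommand </dev/null >> results.txt
done < hosts.txt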
Currently I have to run this script overnight because of how many systems it hits.
I want to achieve the same goal and still get each host's response into that file, but I want the command to be sent to all hosts at the same time, so that the whole run takes 2 to 6 minutes in total. Because this is for work, I am not allowed to install Ansible on the system; is there another way to achieve this goal? If so, please point me in the right direction.
Upvotes: 0
Views: 246
Reputation: 33685
With GNU Parallel:
parallel -j0 --slf hosts.txt --nonall mycommand > out.txt
But maybe you want a bit more info:
parallel -j0 --slf hosts.txt --joblog my.log --tag --nonall mycommand > out.txt
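Here -j0 runs as many jobs in parallel as possible, --slf hosts.txt reads the host names from hosts.txt (one per line, the same file your loop reads now), --nonall runs mycommand once on every host, --tag prefixes each output line with the host it came from, and --joblog my.log records the start time, runtime, and exit value for every host.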
Upvotes: 1
Reputation: 2263
I did this using sh years ago, using something like:
# N = maximum number of concurrent ssh sessions
while read -r hostname
do
    # throttle: wait until fewer than N marker files exist in SomeDir
    while [ "$(ls SomeDir | wc -l)" -ge "$N" ]
    do
        sleep 5
    done
    (touch "SomeDir/$hostname"; ssh "$hostname" ... > "someotherDir/$hostname.txt"; rm "SomeDir/$hostname") &
done < hosts.txt
wait
But this stops working after ~100 hosts. It sucks - don't do it. With fewer than about ~500 hosts, pssh may be the easiest option - maybe you can install it in your home directory?
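If you can get pssh (parallel-ssh) into your home directory, the whole run is roughly the following (a sketch; hosts.txt, outdir, and somecommand are placeholders, and flags can differ slightly between pssh versions):
pssh -h hosts.txt -p 100 -t 0 -o outdir somecommand
-h hosts.txt reads the host list, -p 100 caps the number of concurrent connections, -t 0 disables the per-host timeout (important when the command runs 2-6 minutes), and -o outdir writes each host's output to outdir/<hostname>.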
Google something like python parallel execute process multiple and someone's bound to have a script that will do what you need already.
More than ~500 hosts and you really need to start installing some tools as others have mentioned in the comments.
Upvotes: 0