Highway of Life

Reputation: 24311

How to run multiple perl open commands asynchronously and display output in order

I'm attempting to run multiple SSH commands against multiple servers asynchronously, and I'd like to capture the output from the commands and display it in order. To throw in an additional curveball, I'd like pid3 to run only once pid2 is complete, and pid4 to run only after the first three commands complete. How would this best be accomplished?

Example:

# $pid1 and $pid2 should run asynchronously.
# Note: '-|' opens a *read* pipe on the command's output;
# the "|cmd" form would open a write pipe, which can't be read from.
my $pid1 = open(my $SSH1, '-|', "ssh -t -t runuser\@$server{'app'} 'sudo chef-client'");

my $pid2 = open(my $SSH2, '-|', "ssh -t -t runuser\@$server{'auth'} 'sudo chef-client'");

# This command should wait for $pid2 to complete.
my $pid3 = open(my $SSH3, '-|', "ssh -t -t runuser\@$server{'auth'} \"sudo -- sh -c '$update_commands'\"");

# This command should wait for $pid1-3 to complete before running.
my $pid4 = open(my $SSH4, '-|', "ssh -t -t runuser\@$server{'varn'} \"sudo -- sh -c '$varn_commands'\"");

Upvotes: 2

Views: 939

Answers (3)

salva

Reputation: 10234

Use Net::OpenSSH::Parallel:

# untested!
use Net::OpenSSH::Parallel;
my $pssh = Net::OpenSSH::Parallel->new;

$pssh->add_server(app  => $server{app},  user => 'runuser');
$pssh->add_server(auth => $server{auth}, user => 'runuser');
$pssh->add_server(varn => $server{varn}, user => 'runuser');

$pssh->push('app',  cmd  => @cmd1);
$pssh->push('auth', cmd  => @cmd2);
$pssh->push('auth', cmd  => @cmd3);
$pssh->push('varn', join => '*');
$pssh->push('varn', cmd  => @cmd4);

$pssh->run;

Automating sudo is slightly more complex if you need to pass passwords, but it can still be done. It is explained in the module's documentation.

Upvotes: 2

mob

Reputation: 118595

Forks::Super handles all of these requirements:

use Forks::Super;

# run $command1 and $command2, make stderr available
my $job1 = fork { cmd => $command1, child_fh => 'err' };
my $job2 = fork { cmd => $command2, child_fh => 'err' };

# job 3 must wait for job 2. Collect stdout, stderr
my $job3 = fork { cmd => $command3, depend_on => $job2, child_fh => 'out,err' };

# and job 4 waits for the other 3 jobs
my $job4 = fork { cmd => $command4, depend_on => [ $job1, $job2, $job3 ],
                  child_fh => 'out,err' };

# wait for jobs to finish, then we'll collect output
$_->wait for $job1, $job2, $job3, $job4;
my @output1 = $job1->read_stderr;
my @output2 = $job2->read_stderr;
my @output3 = ($job3->read_stdout, $job3->read_stderr);
my @output4 = ($job4->read_stdout, $job4->read_stderr);
...

Upvotes: 2

Highway of Life

Reputation: 24311

My (somewhat crude) solution thus far. I feel there may be a more elegant way to handle this in Perl, but this may get the job done:

# Silence all non-error output from the commands on the first 2 servers.
# open with '-|' forks ssh and returns a *read* handle on its output;
# the original "|ssh ..." form opened a write pipe, which cannot be read from.
my $pid1 = open(my $SSH1, '-|', "ssh -t -t runuser\@$server{'app'} 'sudo chef-client > /dev/null'");

my $pid2 = open(my $SSH2, '-|', "ssh -t -t runuser\@$server{'auth'} 'sudo chef-client > /dev/null'");

if ($pid1) {
    print "Connecting to $server{'app'}: chef-client\n";
    while ( <$SSH1> ) {
        print "$server{'app'}: $_";
    }
}
close $SSH1 or die "ssh to $server{'app'} failed: $!";

if ($pid2) {
    print "Connecting to $server{'auth'}: chef-client\n";
    while ( <$SSH2> ) {
        print "$server{'auth'}: $_";
    }
}
close $SSH2 or die "ssh to $server{'auth'} failed: $!";

# Run pid3 only after pid2's handle is closed (close() waits for the child).
my $pid3 = open(my $SSH3, '-|', "ssh -t -t runuser\@$server{'auth'} \"sudo -- sh -c '$update_command'\"");
if ($pid3) {
    print "Connecting to $server{'auth'}: $update_command\n";
    while ( <$SSH3> ) {
        print $_;
    }
}
close $SSH3 or die "ssh to $server{'auth'} failed: $!";

# Run pid4 after all previous commands have completed.
my $pid4 = open(my $SSH4, '-|', "ssh -t -t runuser\@$server{'varn'} \"sudo -- sh -c '$varn_command'\"");
if ($pid4) {
    print "Connecting to $server{'varn'}: $varn_command\n";
    while ( <$SSH4> ) {
        print $_;
    }
}
close $SSH4 or die "ssh to $server{'varn'} failed: $!";
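The same pattern can be reduced to a core-Perl sketch: `open '-|'` forks each command and returns a read handle, draining the handles in order prints the output in order even though the commands ran in parallel, and `close()` reaps each child, which gives the stage-2/stage-3 sequencing for free. The `echo` commands below are placeholders standing in for the real ssh invocations, and the stage names are my own; note this only works cleanly when each command's output fits comfortably in the pipe buffer, since we drain the handles one at a time.

```perl
use strict;
use warnings;

# Placeholder commands standing in for the real ssh invocations.
my @stage1 = ('echo app-done', 'echo auth-done');   # run concurrently
my $stage2 = 'echo update-done';                    # runs after stage 1
my $stage3 = 'echo varn-done';                      # runs last

sub start_reader {
    my ($cmd) = @_;
    # '-|' forks, execs $cmd, and returns a handle reading its stdout.
    open(my $fh, '-|', $cmd) or die "cannot start '$cmd': $!";
    return $fh;
}

# Start both stage-1 commands before reading anything, so they run in parallel.
my @fhs = map { start_reader($_) } @stage1;

my @output;
for my $fh (@fhs) {
    push @output, <$fh>;                    # drain in order => ordered output
    close $fh or die "command failed: $?";  # close() waits for the child
}

# Stage 2 starts only now, after the stage-1 children have exited.
my $fh2 = start_reader($stage2);
push @output, <$fh2>;
close $fh2 or die "command failed: $?";

# Stage 3 runs after everything else.
my $fh3 = start_reader($stage3);
push @output, <$fh3>;
close $fh3 or die "command failed: $?";

print @output;
```

This is stricter than strictly required (stage 2 waits for both stage-1 commands, not just the second one), but it keeps the control flow obvious.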

Upvotes: 1
