Reputation: 43
my $pm = Parallel::ForkManager->new(4);
foreach my $array (@lines) {
    $pm->start and next;
    my $cmd = 'command';
    print "\n$cmd\n\n";
    exec($cmd);
    $pm->finish;
}
$pm->wait_all_children;
As you can see, my code runs 4 things at once. It's ffmpeg piping video to x264. The output is messy: it jumps around on a single line between the 4 processes. Is there a way to run these fully in the background and redirect their output so I can cleanly print and update the 4 separate outputs? That way I could tell how far along each process is.
If this absolutely can't be done in Perl, I would gladly accept a pointer to another language that would make it easier. This is under Linux, by the way. Thank you.
Open2 is way beyond me. How would I use this? I can't see how I could print the progress of each job without printing new lines. I want to show the STDERR and STDOUT of each process while it runs, and when it finishes, leave its last status as a line that no longer updates. It's not a good explanation, but I don't know how else to describe it. Basically, the first 4 jobs would each have a line that constantly refreshes; when one of those jobs finishes, a new line is added for the next job, and the finished job is somehow marked as done.
I tried a quick test with "open" and the command still writes to the shell. This was on Windows, but it should behave the same. Is this even going to be possible in Perl, or in a shell?
Hello? I still need help with this...
Upvotes: 3
Views: 909
Reputation: 66967
If you want to capture each process's STDOUT, you can use open instead of exec to run your subprocesses.
foreach my $array (@lines) {
    $pm->start and next;
    my $cmd = 'command';
    open my $cmd_out, '-|', $cmd or die "Can't start process: $!";
    # read from the command's output
    while ( my $line = <$cmd_out> ) {
        # do something with each line of output
    }
    close $cmd_out;
    $pm->finish;
}
If you also need a handle for the process's STDIN, see IPC::Open2. To capture STDERR as well as STDOUT (ffmpeg writes its progress to STDERR), use IPC::Open3.
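Here is a minimal IPC::Open3 sketch showing how to get separate handles for a child's STDOUT and STDERR. The `echo hello` command is just a stand-in for your ffmpeg/x264 pipeline. Note that draining one pipe completely before reading the other can deadlock if the child writes a lot to the pipe you aren't reading; for a chatty command like ffmpeg you'd multiplex the two handles with IO::Select instead.

```perl
use strict;
use warnings;
use IPC::Open3;
use Symbol 'gensym';

# Stand-in for the real ffmpeg/x264 command.
my $cmd = 'echo hello';

my $err = gensym;    # open3 needs a pre-created glob for the STDERR handle
my $pid = open3( my $in, my $out, $err, $cmd );
close $in;           # we have no input to send

# Fine for a quiet command; use IO::Select for real work (see above).
my @stdout_lines = <$out>;
my @stderr_lines = <$err>;
waitpid( $pid, 0 );

print "out: $_" for @stdout_lines;
print "err: $_" for @stderr_lines;
```

From there, each forked child could parse the progress lines it reads and rewrite its own status line (e.g. with carriage returns or ANSI cursor movement) instead of letting all four children scribble on the terminal at once.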
Upvotes: 2