Reputation:
I am running multiple batch files on a remote machine using a Perl script that resides on my local machine, and I want these batch files to run for a long duration.
The problem is that the Perl program running on the local machine halts, and executes the subsequent commands only after the batch files end.
I want to start the batch files on the remote machine and then execute the rest of the commands in my Perl script without blocking.
Please help me out.
Upvotes: 0
Views: 1064
Reputation: 19759
Yes, you could use fork, but I think a better solution would be to have a script on the remote machine that accepts a batch job and returns its id.
The current status of a submitted job can also be retrieved using the same script.
This way the client (i.e., your machine) would not have to manage the jobs itself.
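A minimal sketch of the kind of remote-side job manager described above, assuming a POSIX system. The names `submit_job` and `job_status`, and the choice of the child PID as the job id, are illustrative assumptions, not a standard API:

```perl
use strict;
use warnings;
use POSIX qw(setsid);

# submit_job: fork a detached child to run the batch command and
# return the child's PID, which doubles as the job id.
sub submit_job {
    my @cmd = @_;
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ($pid == 0) {
        setsid;                        # child: detach from the caller's session
        exec @cmd or die "exec failed: $!";
    }
    return $pid;                       # parent: hand the id back to the client
}

# job_status: kill with signal 0 sends nothing; it only tests
# whether the process with that id still exists.
sub job_status {
    my ($id) = @_;
    return kill(0, $id) ? 'running' : 'done';
}

my $id = submit_job('sleep', '2');
print "job $id: ", job_status($id), "\n";
```

The client on the local machine would then call this script over SSH once to submit a job, and again later with the returned id to poll its status, without ever blocking on the job itself.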
Upvotes: 1
Reputation: 66967
How are you running the remote processes? The best answer will probably depend on the specific implementation, but assuming you're using something like Net::SSH, Expect, or some sort of RPC mechanism, the easiest thing is probably to fork a new process to run the remote job and then continue on with your script.
my $pid = fork;
if ( ( defined $pid ) and $pid == 0 ) {
    # child process: run the remote jobs, then exit so the child
    # does not fall through into the rest of the parent's script
    do_remote_batch_jobs();
    exit;
} elsif ( defined $pid ) {
    # parent process: continues immediately without waiting
    do_other_stuff();
} else {
    # fork failed
    die "Unable to fork: $!";
}
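One caveat with this approach: if the parent keeps running after the child finishes, the child lingers as a zombie until it is reaped. A small sketch of non-blocking reaping with `waitpid` and `WNOHANG` (the `sleep` calls here are stand-ins for the real batch job and the parent's other work):

```perl
use strict;
use warnings;
use POSIX ':sys_wait_h';    # for WNOHANG

my $pid = fork;
die "Unable to fork: $!" unless defined $pid;
if ($pid == 0) {
    # child: stand-in for the long-running remote batch job
    sleep 1;
    exit 0;
}

# parent: poll for the child without blocking; waitpid returns 0
# while the child is still running and the PID once it is reaped
while (waitpid($pid, WNOHANG) == 0) {
    # the rest of the script's work would go here
    sleep 1;
}
print "child $pid finished with status ", $? >> 8, "\n";
```

Alternatively, a parent that never needs the child's exit status can set `$SIG{CHLD} = 'IGNORE';` before forking and let the system discard the child silently.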
Upvotes: 2