Razor Storm

Reputation: 12336

perl fork doesn't work properly when run remotely (via ssh)

I have a perl script, script.pl, which, when run, forks: the parent process writes the child's PID to a file and then exits, while the child process prints something to STDOUT and then enters a while loop.

$pid = fork();

if ( ! defined $pid ) 
{
    die "Failed to fork.";
}
#Parent process
elsif($pid)
{
    if(!open (PID, ">>running_PIDs"))
    {
        warn "Error opening file to append PID";
    }
    print PID "$pid  \n";
    close PID;
}
#child process
else
{
    print "Output started";

    while ($loopControl)
    {
        #Do some stuff
    }
}

This works fine when I call it locally, i.e.: perl script.pl.

The script prints some output and then returns control to the shell (while the child process goes off into its loop in the background).

However, when I call this via ssh, control is never returned to the shell (nor is the "Output started" line ever printed).

ie: $ ssh [email protected] 'perl script.pl'

However, the interesting thing is, the child process does run (I can see it when I run ps).

Can anyone explain what's going on?

EDIT:

I ran it under debug and got this:

### Forked, but do not know how to create a new TTY.

Since two debuggers fight for the same TTY, input is severely entangled.

I know how to switch the output to a different window in xterms and OS/2 consoles only. For a manual switch, put the name of the created TTY in $DB::fork_TTY, or define a function DB::get_fork_TTY() returning this.

On UNIX-like systems one can get the name of a TTY for the given window by typing tty, and disconnect the shell from TTY by sleep 1000000.

Upvotes: 6

Views: 2263

Answers (4)

JohnGH

Reputation: 843

To understand this better I would recommend reading @Jax's solution on

Getting ssh to execute a command in the background on target machine

It's nothing to do with Perl. It's because of the way SSH handles any long-running process you're trying to background.

I needed to launch script.pl from a bash script (to define essential local variables on the target host):

$ ssh [email protected] /path/to/launcher.sh

/path/to/launcher.sh was invoking the Perl script with:

CMD="/path/to/script.pl -some_arg=$VALUE -other_arg"
$CMD &

which worked locally, but when run via ssh it didn't return.

I tried @pra's solution inside the Perl script, but it didn't work in my case.

Using @Jax's solution, I replaced $CMD & with this:

nohup $CMD > /path/to/output.log 2>&1 < /dev/null &

and it works beautifully.
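For concreteness, here is a minimal sketch of such a launcher. The paths are hypothetical, and sleep 30 stands in for /path/to/script.pl so the sketch is self-contained:

```shell
#!/bin/sh
# Hypothetical launcher.sh: "sleep 30" stands in for /path/to/script.pl,
# and /tmp/output.log for the real log path.
CMD="sleep 30"

# Detach fully: ignore HUP, send output to a log, take input from /dev/null,
# and background the job so ssh has nothing left to wait on.
nohup $CMD > /tmp/output.log 2>&1 < /dev/null &
child=$!
echo "launched: $child"
```

Because none of the three standard descriptors still point at the ssh channel, the ssh session can close immediately while the child keeps running.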

Upvotes: 0

pra

Reputation: 8629

Whenever you launch background jobs via non-interactive ssh commands, you need to close or otherwise tie off stdin, stdout, and stderr. Otherwise ssh will wait for the backgrounded process to exit (see the ssh FAQ).

This is called disassociating or detaching from the controlling terminal and is a general best practice when writing background jobs, not just for SSH.

So the simplest change that doesn't mute your entire command is to add:

#close std fds inherited from parent
close STDIN;
close STDOUT;
close STDERR;

right after your print "Output started";. If your child process needs to print output periodically during its run, then you'll need to redirect to a log file instead.
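The waiting-for-EOF behaviour can be reproduced locally without ssh. A command substitution, like the ssh session, reads its pipe until EOF, and EOF only arrives once every process holding the write end, including a backgrounded child, has closed its copy. A minimal sketch, with sleep standing in for the forked child:

```shell
#!/bin/sh
# Local analogue of the ssh hang: the reader blocks until every writer
# (including a backgrounded child) closes its copy of the descriptor.

t0=$(date +%s)
out1=$( (sleep 2 &); echo started )   # child inherits stdout: reader waits ~2s
hang=$(( $(date +%s) - t0 ))

t0=$(date +%s)
out2=$( (sleep 2 >/dev/null 2>&1 </dev/null &); echo started )  # detached: returns at once
fast=$(( $(date +%s) - t0 ))

echo "fds inherited: ${hang}s, fds detached: ${fast}s"
```

Closing (or redirecting) the child's standard descriptors is exactly what makes the second command substitution, and by analogy the ssh session, return immediately.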

Upvotes: 9

ninjalj

Reputation: 43718

What is happening is that ssh is executing 'perl script.pl' as a command directly. If you have 'screen' available, you could do:

$ ssh [email protected] 'screen -d -m perl script.pl'

to have it running in a detached screen session, and reattach later with screen -r.

Upvotes: 2

Brian Roach

Reputation: 76908

ssh [email protected] 'nohup perl script.pl'

You aren't able to exit because there's still a process attached. You need to nohup it.

Upvotes: 3
