Reputation: 5828
In my Perl/CGI web application, I sometimes need to run a long process, which makes the wait for the next page interminable. So I've been disabling output buffering as below, so that the page content gets sent to the client before the long process runs.
local $| = 1;                        # enable autoflush on STDOUT
print "Content-type: text/html\n\n";
print $output;                       # page is sent to the client immediately
&background_process();               # ...then the long-running work starts
However, it seems to me that buffering has its uses and I shouldn't make a habit of doing this. Is there a better way to run a long process and still return HTML to the client quickly? Should I be forking, or some such?
Upvotes: 1
Views: 359
Reputation: 182782
Here is some code I use to spawn off a background process in my FastCGI script:
use POSIX qw(setsid);   # needed for setsid()

$SIG{CHLD} = 'IGNORE';  # auto-reap exited children; no zombies

# Flush stdout so the response is complete before we fork.
my $ofh = select(STDOUT); $| = 1; select $ofh;

my $kpid = fork;
if ($kpid)
{
    # Parent process
    waitpid($kpid, 0);
}
else
{
    close STDIN;
    close STDOUT;
    close STDERR;
    setsid();
    my $gpid = fork;
    if (!$gpid)
    {
        open(STDIN,  "</dev/null");
        open(STDOUT, ">/dev/null");
        open(STDERR, ">/dev/null");
        # Grandchild: run the background program.
        # $pgm and @execargs hold the command and its arguments.
        exec($pgm, @execargs);
    }
    exit 0;
}
A couple of salient points here:

- `$SIG{CHLD} = 'IGNORE'` tells the kernel to reap exited children automatically, so the short-lived intermediate child never becomes a zombie.
- The `select`/`$|` dance flushes STDOUT so the client receives the full response before the fork.
- Closing STDIN/STDOUT/STDERR and calling `setsid()` detaches the background process from the web server; FastCGI in particular will not consider the request finished while a child process still holds the output stream open.
- The double fork means the process that eventually calls `exec` is not a direct child of the script, so it keeps running after the script exits.
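For completeness, here is one way the pattern above might be wrapped in a reusable sub and called from a request handler. The sub name `spawn_detached` and the sleep-based example command are my own illustrations, not from the answer:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use POSIX qw(setsid);

# Hypothetical wrapper around the double-fork pattern above.
sub spawn_detached {
    my ($pgm, @execargs) = @_;

    local $SIG{CHLD} = 'IGNORE';   # auto-reap the intermediate child

    my $kpid = fork;
    die "fork failed: $!" unless defined $kpid;
    if ($kpid) {
        waitpid($kpid, 0);         # intermediate child exits immediately
        return;
    }

    # Intermediate child: detach from the server's session and stdio.
    close STDIN; close STDOUT; close STDERR;
    setsid();

    my $gpid = fork;
    if (!$gpid) {
        open STDIN,  '<', '/dev/null';
        open STDOUT, '>', '/dev/null';
        open STDERR, '>', '/dev/null';
        exec $pgm, @execargs;      # grandchild becomes the worker
    }
    exit 0;                        # intermediate child is done
}

# Example: kick off a long job without blocking the response.
spawn_detached('/bin/sh', '-c', 'sleep 60; echo done >> /tmp/job.log');
```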
Upvotes: 4
Reputation: 91922
The best way is to fork your process and put it in the background. That way it won't be killed when the user leaves the page, and the browser won't sit in a loading state the whole time.
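A minimal sketch of that approach for the plain-CGI case in the question; the `do_long_work` sub is a stand-in for the actual job, not part of this answer:

```perl
#!/usr/bin/perl
use strict;
use warnings;

$| = 1;                                   # autoflush so the page goes out now
print "Content-type: text/html\n\n";
print "<html><body>Job started.</body></html>";

$SIG{CHLD} = 'IGNORE';                    # don't leave a zombie behind
my $pid = fork;
die "fork failed: $!" unless defined $pid;

if ($pid == 0) {
    # Child: release the connection so the browser stops loading,
    # then do the slow work in the background.
    close STDIN; close STDOUT; close STDERR;
    do_long_work();
    exit 0;
}
# Parent falls through and finishes the request immediately.

sub do_long_work {
    sleep 30;                             # stand-in for the real job
}
```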
Upvotes: 4