Nikos C.

Reputation: 51850

FastCGI doesn't execute scripts in parallel

I'm still new to FastCGI, and I can't get my head around how to execute Perl scripts in parallel.

The problem is that when I open the same URL multiple times in a browser, for example:

http://example.com/myscript.pl

which just contains a busy loop:

#! /usr/bin/perl

for (my $i = 0; $i <= 700000000; $i++) { }

then only one FCGI process is running, using 100% CPU, even though my webserver is spawning 8 of them (I'm using lighttpd).

However, if I copy the script to another file, and then open them both:

http://example.com/myscript.pl
http://example.com/myscript_copy.pl

then two processes are running, at 50% CPU each. I just don't understand that.

My Perl FastCGI dispatch script (i.e. the script that the webserver spawns 8 times) is this:

#! /usr/bin/perl

use CGI::Fast;

while (CGI::Fast->new) {
    do $ENV{SCRIPT_FILENAME};
}

One solution would be to spawn a new Perl process for every script requested by clients. But that would defeat the purpose of FastCGI; I don't want to spawn dozens of Perl processes per second for every incoming request. That's way too much overhead.

How's Perl supposed to work with FastCGI? I must be doing something fundamentally wrong here...

In case my lighttpd configuration matters (maybe there's a mistake there), here it is:

server.modules = (
    "mod_access",
    "mod_alias",
    "mod_compress",
    "mod_redirect",
    "mod_fastcgi",
)

server.document-root = "/var/www"
server.upload-dirs   = ( "/var/cache/lighttpd/uploads" )
server.errorlog      = "/var/log/lighttpd/error.log"
server.pid-file      = "/var/run/lighttpd.pid"
server.username      = "www-data"
server.groupname     = "www-data"

index-file.names     = ( "index.php", "index.html",
                         "index.htm", "default.htm",
                         "index.lighttpd.html" )

url.access-deny      = ( "~", ".inc" )

static-file.exclude-extensions = ( ".php", ".pl", ".fcgi" )

include_shell "/usr/share/lighttpd/use-ipv6.pl"

dir-listing.encoding = "utf-8"
server.dir-listing   = "enable"

compress.cache-dir = "/var/cache/lighttpd/compress/"
compress.filetype  = ( "application/x-javascript", "text/css",
                       "text/html", "text/plain" )

include_shell "/usr/share/lighttpd/create-mime.assign.pl"
include_shell "/usr/share/lighttpd/include-conf-enabled.pl"

fastcgi.server = (
    ".pl" => ((
        "socket" => "/var/run/lighttpd/perl-fcgi.socket",
        "bin-path" => "/usr/local/lib/cgi-bin/perl-dispatcher.fcgi",
        "check-local" => "disable",
        "max-procs" => 8,
    ))
)

Upvotes: 1

Views: 2327

Answers (1)

Dave Sherohman

Reputation: 46187

One solution would be to spawn a new Perl process for every script requested by clients. But that would defeat the purpose of FastCGI; I don't want to spawn dozens of Perl processes per second for every incoming request.

You appear to have misunderstood FastCGI. Each process still only handles a single request at a time, so you can't handle dozens of requests simultaneously without creating dozens of processes to handle them.

What FastCGI actually gives you is process reuse: it keeps processes around so they can serve later requests, and it can be configured to spawn processes in advance, before a request that needs them arrives. Both of these features minimize the need to spawn processes on demand while a client is waiting, but they do not change the need for one server process per concurrently handled request.
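To make the pre-spawn model concrete, here is a minimal sketch of a prefork FastCGI handler using the FCGI and FCGI::ProcManager CPAN modules (assuming both are installed; the worker count and response body are illustrative). The manager forks a fixed pool of workers up front; each worker then blocks in Accept() on the shared listening socket and handles one request at a time, which is exactly the one-process-per-concurrent-request model described above:

```perl
#! /usr/bin/perl
use strict;
use warnings;
use FCGI;
use FCGI::ProcManager;

# Pre-fork a fixed pool of 8 workers. The parent becomes the process
# manager; each child falls through to the accept loop below.
my $manager = FCGI::ProcManager->new({ n_processes => 8 });
$manager->pm_manage();

# With no arguments, FCGI::Request() uses the listening socket the
# web server handed us (file descriptor 0 when spawned via bin-path).
my $request = FCGI::Request();

while ($request->Accept() >= 0) {
    $manager->pm_pre_dispatch();

    # Each worker serves exactly one request at a time; concurrency
    # comes from having several workers accepting on the same socket.
    print "Content-Type: text/plain\r\n\r\n";
    print "Handled by worker PID $$\n";

    $manager->pm_post_dispatch();
}
```

Whether the dispatcher pre-forks itself (as here) or lets the web server spawn N copies (as with lighttpd's `max-procs`), the result is the same: concurrency equals the number of worker processes, and FastCGI's benefit is that those workers persist across requests instead of being re-spawned per request.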

Upvotes: 1
