Reputation: 749
I'm using Perl to populate two HTML tables (each table is generated in its own Perl CGI script). I have a 'helper' script that runs an SQL query. Here's an example (let's call this script run_sql_query.pl):
# run_sql_query.pl: prepare and execute the query; $sth is shared via "our"
my $sql_query = "SELECT ID from TABLE where ID > 3";
our $sth = $dbh->prepare($sql_query);
$sth->execute;
Then, in each of my two Perl CGI scripts, I do:
require 'run_sql_query.pl';
our $sth;
while (my ($table_id) = $sth->fetchrow_array) {
...
}
However, it looks like run_sql_query.pl is running twice (once in each of the Perl CGI scripts). How can I have it so run_sql_query.pl gets executed only once, and then use the same $sth contents in both of the Perl CGI scripts?
Upvotes: 3
Views: 724
Reputation: 164769
If I understand correctly, you want to save the result of a query so it can be used again by a different process. There are several ways of doing this, all involving setting up a cache of some sort.
CGI processes have separate memory, so you can't do what you ask directly. Worse, they're started and stopped for each request, wasting a lot of time compiling. However, there are lots of ways around that.
If your SQL query is expensive, and the data does not change often, you can have the database cache the result. Here's the MySQL query cache, for example. This has the advantage that everything which uses the database will benefit from it. The disadvantage is your programs still have to fetch and process all the data. Use this if your query is expensive, but fetching and processing it is not, and your data does not change often.
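For instance, here's a minimal sketch of leaning on the query cache from the Perl side. This assumes a MySQL server before 8.0 with query_cache_type enabled (the query cache and the SQL_CACHE hint were removed in MySQL 8.0), and it reuses the placeholder table name from your question:
# Sketch only: ask MySQL to serve this result from its query cache when possible.
# Requires query_cache_type to be enabled on the server (MySQL < 8.0).
my $sth = $dbh->prepare("SELECT SQL_CACHE ID FROM TABLE WHERE ID > 3");
$sth->execute;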
Another option is to set up an external cache, such as memcached, to store the results. Memcached is basically a server on your machine that stores a big hash of data. Or you can even use the database itself to cache the processed HTML table. Each process can check whether the data it wants is in the cache before calculating it itself. You can use something like CHI to interface with it. The advantage is you can chuck anything you want into the cache. The disadvantages are that you have to manage the cache yourself (i.e. delete data when it goes out of date), it's another thing to manage, and the cache could disappear at any time.
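Here's a minimal sketch of that with CHI and memcached. It needs CHI and CHI::Driver::Memcached installed, and the namespace, cache key, expiry, and build_html_table() helper are made-up names for illustration:
use CHI;

my $cache = CHI->new(
    driver    => 'Memcached',              # needs CHI::Driver::Memcached
    servers   => ['127.0.0.1:11211'],
    namespace => 'my_app',                 # illustrative namespace
);

# Both CGI scripts can run this; only the first one to miss the cache
# actually hits the database.
my $html_table = $cache->get('ids_over_3');
unless (defined $html_table) {
    $html_table = build_html_table($dbh);  # hypothetical helper that runs the query
    $cache->set('ids_over_3', $html_table, '10 minutes');
}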
Finally, you can change how your processes are run. A basic efficiency gain is to switch from plain CGI, which starts, compiles, runs and finishes your program for every request, to something like FastCGI, which runs your program in its own little server. Think of it like putting a while loop around your whole program. Since your program isn't shutting down between requests, you can store data in global variables. Then it's a simple matter to do what @foampile suggested: store your table in our @Table_Cache or the like. This will cache it for that process but not for others. Each different program is its own process, so foo.cgi won't share with bar.cgi. The advantages are that it's easy, FastCGI is something you should be switching to anyway, and your web server probably already supports it. The disadvantages are that you need to manage the cache (i.e. delete outdated entries), and it doesn't share across processes.
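A rough sketch of the FastCGI version with a per-process cache; build_html_table() is a made-up helper standing in for your query-and-format code:
#!/usr/bin/perl
use strict;
use warnings;
use CGI::Fast;

our %Table_Cache;    # lives as long as this FastCGI process does

while ( my $q = CGI::Fast->new ) {
    # Build the table on the first request this process sees, reuse it afterwards.
    $Table_Cache{ids_over_3} ||= build_html_table();

    print $q->header;
    print $Table_Cache{ids_over_3};
}

sub build_html_table {
    # Run the SQL query and turn the rows into an HTML string here.
    return '<table><!-- rows go here --></table>';
}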
I'd start with FastCGI and see if that fixes your performance problems. It means you'll be running each query only twice per server restart instead of at every request.
Upvotes: 7
Reputation: 385657
There's no way to truly share a Perl variable between processes (or even threads).
But even if you could, the whole concept makes no sense. The database would have no way of knowing it's talking to two clients, so one client would end up with some of the records, and the other would end up with the rest.
You have to fetch the data twice. You could store the data in some intermediate place and have two clients fetch it from there, but you haven't indicated any need for that.
Upvotes: -1
Reputation: 31222
You should feed the query results into a hash of hashes (or another data structure) the first time you iterate through the result set, and on subsequent passes just walk or reference that hash/structure:
my $dataStruct;
# Assumes the query selects a second column alongside the ID; DBI has no
# $sth->data method, so pull the column straight from fetchrow_array instead.
while (my ($table_id, $colVal) = $sth->fetchrow_array) {
    $dataStruct->{$table_id}->{'col'} = $colVal;
}
Then later you iterate through $dataStruct. Let me know if you need instructions on how to do that.
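For what it's worth, going through it later could look something like this (the 'col' key matches the loop above, and the HTML row format is just an example):
for my $table_id (sort { $a <=> $b } keys %$dataStruct) {
    my $colVal = $dataStruct->{$table_id}->{'col'};
    print "<tr><td>$table_id</td><td>$colVal</td></tr>\n";
}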
Upvotes: 1