Reputation: 35087
I have tried Parallel::ForkManager with DBI, but I get the error: DBD::mysql::st execute failed: Lost connection to MySQL server during query.
Here is the sample code. I want to run the query between a low and a high value (I have split the work into chunks of 10k records):
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(50);
my $db = krish::DB->new or die $!;    # it has all the connection details

while ( $low < $high ) {
    # Some value manipulation
    my $pid = $pm->start and next;

    # db_execution returns the executed statement handle
    while ( my $sth = db_execution( $db, $low, $high ) ) {
        ...
        # fetch row operation
        ...
    }
    $pm->finish;
}

sub db_execution {
    ...
    my $dbh = $db->connect('students') or die $!;
    my $sth = $dbh->prepare($sql) or die "$!:" . $dbh->errstr;
    $sth->execute or die "$!:" . $sth->errstr;
    ...
}
The same code runs fine without parallel processing. What is the issue, and how can I resolve it?
Upvotes: 5
Views: 3309
Reputation: 4778
When you share database connections between processes (which is what you're doing with a fork), you need to make sure that one process doesn't close the connection out from under another. Because the connection handle is an ordinary Perl object, when an interpreter shuts down it calls that object's DESTROY method, which in this case closes the connection.
So if any of the children close the db connection (which happens automatically when they finish and shut down), it is killed out from under the parent process. The way to prevent this is to set InactiveDestroy to true in the parent process before the fork, and then to close the connection explicitly in the parent when done.
https://metacpan.org/pod/DBI#InactiveDestroy
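A rough sketch of that pattern, using DBI directly instead of the krish::DB wrapper from the question; the DSN, credentials, table, columns, and chunk stepping below are placeholders, and the per-child connection is one way to apply the "new connection in each child" advice as well:

use strict;
use warnings;
use DBI;
use Parallel::ForkManager;

# Placeholder connection details and range bounds.
my ( $user, $pass ) = ( 'user', 'secret' );
my ( $low,  $high ) = ( 1, 100_000 );
my $chunk = 10_000;

my $pm  = Parallel::ForkManager->new(50);
my $dbh = DBI->connect( 'dbi:mysql:database=students', $user, $pass,
    { RaiseError => 1 } );

# Children inherit a copy of this handle when they fork; with
# InactiveDestroy set, destroying that copy does not close the
# server-side connection.
$dbh->{InactiveDestroy} = 1;

while ( $low < $high ) {
    my ( $from, $to ) = ( $low, $low + $chunk - 1 );
    $low += $chunk;

    my $pid = $pm->start and next;    # parent loops on; child continues below

    # Give each child its own connection for the actual queries
    # instead of reusing the parent's handle.
    my $child_dbh = DBI->connect( 'dbi:mysql:database=students', $user,
        $pass, { RaiseError => 1 } );

    my $sth = $child_dbh->prepare(
        'SELECT id, name FROM students WHERE id BETWEEN ? AND ?');
    $sth->execute( $from, $to );
    while ( my $row = $sth->fetchrow_arrayref ) {
        # ... process the row ...
    }

    $child_dbh->disconnect;
    $pm->finish;
}
$pm->wait_all_children;

# Because InactiveDestroy suppresses the automatic close, the parent
# must disconnect explicitly once all children are done.
$dbh->disconnect;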
Upvotes: 9
Reputation: 6524
You are asking for trouble by using the same db handle simultaneously in all of the child processes. You should be creating a new connection in each child.
Never mind... I read the rest of the code.
Upvotes: 0