Reputation: 480
I am connecting to a mysql database using the following code:
my $dbh = DBI->connect("DBI:mysql:database=test;host=localhost", $user, $pass)
    or die $DBI::errstr;
my $sqlQuery = $dbh->prepare($query)
    or die "Can't prepare $query: " . $dbh->errstr . "\n";
my $rv = $sqlQuery->execute
    or die "Can't execute the query: " . $sqlQuery->errstr;
while (my @row = $sqlQuery->fetchrow_array()) {
    # do something with @row
}
My concern is this: the code works fine while my application is interacting with small databases. But when I move this application to a live environment where the database size may be in the hundreds of GBs, what performance issues can this code cause? Effectively, what I am asking is: at the line
@row = $sqlQuery->fetchrow_array();
will Perl copy the entire table's contents into the variable? If so, won't that cause significant performance issues for my application as well as for the database server?
Upvotes: 5
Views: 227
Reputation: 5222
In the line:
@row = $sqlQuery->fetchrow_array();
the database returns one row at a time to Perl, so even when interacting with a massive database you will not dump the entire result set of the query into a variable.
On the other hand,
$arrRef = $sqlQuery->fetchall_arrayref();
fetches the entire result set into memory in one call, so avoid it for queries that can return a very large number of rows.
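To keep memory bounded on huge result sets, you can combine row-at-a-time fetching with batching: DBI's fetchall_arrayref accepts a $max_rows argument, and DBD::mysql additionally offers the mysql_use_result statement attribute to stream rows from the server instead of buffering the whole result set client-side. A minimal sketch, assuming a hypothetical big_table and placeholder credentials:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical credentials for illustration only.
my ($user, $pass) = ('user', 'secret');

my $dbh = DBI->connect("DBI:mysql:database=test;host=localhost",
                       $user, $pass, { RaiseError => 1 });

# DBD::mysql specific: stream rows from the server rather than
# buffering the entire result set in the client library first.
my $sth = $dbh->prepare("SELECT id, name FROM big_table",
                        { mysql_use_result => 1 });
$sth->execute;

# fetchall_arrayref({}, $max_rows) returns at most $max_rows rows
# per call, so memory use stays bounded regardless of table size.
while (my $batch = $sth->fetchall_arrayref({}, 1000)) {
    last unless @$batch;
    for my $row (@$batch) {
        # process one row, e.g. $row->{id}, $row->{name}
    }
}

$dbh->disconnect;
```

Note that with mysql_use_result the statement handle holds a server-side cursor open until all rows are fetched, which can block other clients' writes to the table, so it is a trade-off rather than a free win.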
Upvotes: 3