Reputation: 36696
I want to iterate over a big table that doesn't fit in memory.
In Java I can use a cursor to load the contents as needed without overflowing memory. How do I do the same in Perl?
The database is PostgreSQL, and I'm accessing it through DBI.
Upvotes: 0
Views: 2938
Reputation: 36696
I ended up using a PostgreSQL server-side cursor.
my $sql = "SOME QUERY HERE";

# Declare a server-side cursor; WITH HOLD keeps it usable across commits.
$dbh->do("DECLARE csr CURSOR WITH HOLD FOR $sql");

# Pull the rows from the cursor in batches of 100.
my $sth = $dbh->prepare("FETCH 100 FROM csr");
my $count = 0;

$sth->execute;
while (my $ref = $sth->fetchrow_hashref()) {
    # ... processing here
    $count++;
    # After each batch of 100 rows, fetch the next batch from the cursor.
    if ($count % 100 == 0) {
        $sth->execute;
    }
}

$dbh->do("CLOSE csr");
Upvotes: 4
Reputation: 5992
What's wrong with:
my $s = $h->prepare("SELECT ...");
$s->execute;
while (my $row = $s->fetchrow_arrayref) {
    # do something
}
Upvotes: 1
Reputation: 7228
Look at DBD::Pg docs for an example.
Use the DBI fetchrow_* functions in a while() loop to keep memory use small, and avoid the fetchall_* functions.
There may also be other database-side options that affect memory usage.
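For illustration, a minimal sketch of that row-at-a-time pattern; the connection parameters and the table name big_table are placeholders, not something from the answer:

use strict;
use warnings;
use DBI;

# Placeholder connection values; adjust for your environment.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "user", "password",
                       { RaiseError => 1, AutoCommit => 1 });

my $sth = $dbh->prepare("SELECT id, payload FROM big_table");
$sth->execute;

# Fetch one row at a time instead of slurping everything with fetchall_*.
while (my $row = $sth->fetchrow_hashref) {
    # ... process $row->{id} and $row->{payload} here
}

$sth->finish;
$dbh->disconnect;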
Upvotes: 0
Reputation: 127136
Just use a database cursor in PostgreSQL. An example from the manual:
BEGIN WORK;
-- Set up a cursor:
DECLARE liahona SCROLL CURSOR FOR SELECT * FROM films;
-- Fetch the first 5 rows in the cursor liahona:
FETCH FORWARD 5 FROM liahona;
code | title | did | date_prod | kind | len
-------+-------------------------+-----+------------+----------+-------
BL101 | The Third Man | 101 | 1949-12-23 | Drama | 01:44
BL102 | The African Queen | 101 | 1951-08-11 | Romantic | 01:43
JL201 | Une Femme est une Femme | 102 | 1961-03-12 | Romantic | 01:25
P_301 | Vertigo | 103 | 1958-11-14 | Action | 02:08
P_302 | Becket | 103 | 1964-02-03 | Drama | 02:28
-- Fetch the previous row:
FETCH PRIOR FROM liahona;
code | title | did | date_prod | kind | len
-------+---------+-----+------------+--------+-------
P_301 | Vertigo | 103 | 1958-11-14 | Action | 02:08
-- Close the cursor and end the transaction:
CLOSE liahona;
COMMIT WORK;
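Translated to Perl with DBI, the same cursor flow might look like the sketch below; the cursor name liahona and the films table come from the manual example above, while the connection parameters are placeholders:

use strict;
use warnings;
use DBI;

# Placeholder connection values; AutoCommit => 0 opens a transaction,
# which a non-holdable cursor requires.
my $dbh = DBI->connect("dbi:Pg:dbname=mydb", "user", "password",
                       { RaiseError => 1, AutoCommit => 0 });

# Set up the cursor.
$dbh->do("DECLARE liahona SCROLL CURSOR FOR SELECT * FROM films");

# Fetch 5 rows at a time until the cursor is exhausted.
my $sth = $dbh->prepare("FETCH FORWARD 5 FROM liahona");
my $done = 0;
until ($done) {
    $sth->execute;
    my $got = 0;
    while (my $row = $sth->fetchrow_hashref) {
        $got++;
        # ... process $row here, e.g. $row->{title}
    }
    $done = 1 unless $got;
}

# Close the cursor and end the transaction.
$dbh->do("CLOSE liahona");
$dbh->commit;
$dbh->disconnect;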
Upvotes: 4