Reputation: 685
I would like to run a SELECT which (as I tested in SQL Developer) will retrieve a very large amount of data (~350,000 rows). Eventually, I would like to write these rows to a file.
I intend to write a Perl script that does this, and I was wondering if there is a way to retrieve one row at a time, write it to my output file, and continue to the next row.
That is, I would like to get the same result as my original SELECT, but in smaller chunks.
Thanks!!!
Upvotes: 0
Views: 9194
Reputation: 10666
You just need to use
$dbh->{'mysql_use_result'} = 1;
to tell the MySQL driver not to load the whole result set into memory.
Next you just process each result row with
while ( my $ary_ref = $sth->fetchrow_arrayref ) {
....
}
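Putting it together, a minimal sketch of the whole flow. The DSN, credentials, table, and column names below are placeholders, not from the question:

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Placeholder DSN and credentials -- adjust to your environment.
my $dbh = DBI->connect('DBI:mysql:database=mydb;host=localhost',
                       'user', 'password', { RaiseError => 1 });

# Stream rows from the server instead of buffering all ~350k client-side.
$dbh->{'mysql_use_result'} = 1;

my $sth = $dbh->prepare('SELECT col1, col2 FROM my_table');
$sth->execute;

open my $out, '>', 'output.txt' or die "Cannot open output.txt: $!";
while ( my $ary_ref = $sth->fetchrow_arrayref ) {
    # NULLs come back as undef; write them as empty strings.
    print {$out} join("\t", map { defined $_ ? $_ : '' } @$ary_ref), "\n";
}
close $out;
$dbh->disconnect;

One caveat with mysql_use_result: the connection stays busy until the full result set has been fetched, so other statements on the same handle have to wait.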
Upvotes: 2
Reputation: 43434
You said:
I was wondering if there was a way to retrieve one row at a time, write it to my output file and continue to the next row.
If you're just trying to export data to a file, you should consider using SELECT ... INTO OUTFILE.
It'll run much faster than issuing a separate query for each row.
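For illustration, a sketch of that approach from Perl (table, columns, and the path are placeholders; note that INTO OUTFILE writes the file on the database server's filesystem, not the client's, and requires the FILE privilege):

# The server writes the file itself -- no rows travel to the Perl client.
# '/tmp/my_table.csv' must be writable by the MySQL server process.
$dbh->do(q{
    SELECT col1, col2
      INTO OUTFILE '/tmp/my_table.csv'
    FIELDS TERMINATED BY ',' ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
      FROM my_table
});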
Upvotes: 2
Reputation: 304
Create a loop:
SELECT * FROM my_table LIMIT 0, 1
and then, on the next iteration, increase the offset by one:
SELECT * FROM my_table LIMIT 1, 1
Upvotes: 0
Reputation: 57583
If you can afford to execute your query many times, you can use
SELECT .... FROM ...
WHERE ....
LIMIT 0, 100 -- offset, row_count
the first time, then add the row count to the offset on the next pass:
SELECT .... FROM ...
WHERE ....
LIMIT 100, 100
You could use LIMIT 0, 1
to get one record at a time, but with 350k records that would mean 350,000 separate queries... it's crazy!!
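A rough Perl sketch of this batching pattern (assuming a DBI handle $dbh, a placeholder table, and an indexed id column; the ORDER BY is needed for stable paging, and the integers are interpolated with sprintf because some drivers quote bound LIMIT parameters):

my $batch  = 100;
my $offset = 0;
while (1) {
    # %d guarantees the values land in the SQL as plain integers.
    my $rows = $dbh->selectall_arrayref(
        sprintf('SELECT col1, col2 FROM my_table ORDER BY id LIMIT %d, %d',
                $offset, $batch)
    );
    last unless @$rows;    # no more rows: done
    print join("\t", map { defined $_ ? $_ : '' } @$_), "\n" for @$rows;
    $offset += $batch;     # advance to the next chunk
}

Keep in mind that each iteration re-executes the query, so on a large table this can be much slower than streaming the result once.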
Upvotes: 2