Matthieu Napoli

Reputation: 49713

How to paginate a large query correctly with PDO to avoid "out of memory" errors?

I have a very large table which I want to process row by row in PHP.

Here is what I have tried:

So my questions:

Or do you see another better solution?

Upvotes: 1

Views: 2806

Answers (1)

stepozer

Reputation: 1191

Perhaps I don't fully understand your question... But I created a simple script that iterates over all rows of a table with ~5,436,226 rows (and 19 columns) and appends them to an output file. I used PostgreSQL instead of MySQL (but I don't think that matters here; you would only need to adjust the LIMIT/OFFSET clause).

<?php
ini_set('memory_limit', '100M');

$pdo  = new PDO('pgsql:host=localhost;port=5432;dbname=test', 'postgres', 'postgres');
$page = 0;
while (true) {
    echo $page.PHP_EOL;
    // Compute the offset from the current page *before* incrementing,
    // otherwise the first 100 rows are skipped.
    $stmt = $pdo->prepare('SELECT * FROM table ORDER BY id LIMIT 100 OFFSET '.($page * 100));
    $stmt->execute();
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // no more rows: stop instead of looping forever
    }
    file_put_contents('/var/www/test/tmp/out.txt', json_encode($rows), FILE_APPEND);
    $page++;
}

The output file size is ~1 GB.
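One caveat with LIMIT/OFFSET: the database still has to scan and discard all the skipped rows, so each page gets slower as the offset grows. On very large tables, keyset pagination (paging by the last seen `id` instead of an offset) stays fast. Here is a minimal sketch of the idea; it uses an in-memory SQLite table (hypothetical table name `t` and column `val`) purely for illustration, but the same `WHERE id > ? ORDER BY id LIMIT n` pattern applies to PostgreSQL or MySQL:

```php
<?php
// Keyset pagination sketch: page through a table by remembering the
// last seen primary key, instead of using an ever-growing OFFSET.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)');
$insert = $pdo->prepare('INSERT INTO t (val) VALUES (?)');
for ($i = 1; $i <= 250; $i++) {
    $insert->execute(["row$i"]);
}

$lastId = 0;
$count  = 0;
while (true) {
    // Each query starts right after the last row of the previous page,
    // so the database never re-scans rows it already returned.
    $stmt = $pdo->prepare('SELECT * FROM t WHERE id > ? ORDER BY id LIMIT 100');
    $stmt->execute([$lastId]);
    $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
    if (!$rows) {
        break; // table exhausted
    }
    $count += count($rows);
    $lastId = end($rows)['id']; // remember the last key for the next page
}
echo $count, PHP_EOL; // all 250 rows visited, in pages of 100, 100, 50
```

This only works when you order by a unique, indexed column (here the primary key), but in exchange every page costs the same regardless of how deep into the table you are.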

Upvotes: 2
