zokibtmkd

Reputation: 2183

PHP / MySQL: no results returned when the number of rows is bigger than 6400

I am using a simple PHP query to fetch rows from the database. There are over 7000 rows in the table, and whenever I fetch all of them the script dies, but when I limit the results to 6400 everything works fine. Is there any limitation in MySQL or PHP that I should be aware of? If so, where do I configure these settings? Any help is highly appreciated. Please note that my PHP settings allow a script execution time of 1800 seconds and the memory limit is set to 512MB.

Upvotes: 2

Views: 718

Answers (2)

Jovan Perovic

Reputation: 20201

As I see it, you have two solutions:

1) The one that @burzum suggested. Really nice, although you would have to establish the maximum chunk size empirically, based on your server load (if it's not constant).

2) Use mysql_unbuffered_query()

mysql_unbuffered_query() does have some drawbacks, as described in the PHP manual:

The benefits of mysql_unbuffered_query() come at a cost: you cannot use mysql_num_rows() and mysql_data_seek() on a result set returned from mysql_unbuffered_query(), until all rows are fetched. You also have to fetch all result rows from an unbuffered SQL query before you can send a new SQL query to MySQL, using the same link_identifier.

But since you're dealing with a large dataset, it seems well justified...
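A minimal sketch of option 2, assuming the legacy mysql extension (removed in PHP 7) and placeholder table, column, and connection names; the call is guarded so the sketch is a no-op where that extension is not loaded:

```php
<?php
// Sketch: stream rows without buffering the whole result set in PHP memory.
// stream_rows() is a hypothetical helper; $link is an open mysql_connect()
// handle, and the SQL uses placeholder table/column names.
function stream_rows($sql, $link)
{
    $result = mysql_unbuffered_query($sql, $link);
    if ($result === false) {
        die('Query failed: ' . mysql_error($link));
    }
    while ($row = mysql_fetch_assoc($result)) {
        // Rows are pulled from the server one at a time, so memory use
        // stays roughly constant regardless of result-set size.
        echo $row['id'], ' ', $row['name'], "\n";
    }
    // The result must be drained before sending another query on this link.
    mysql_free_result($result);
}

if (function_exists('mysql_unbuffered_query')) {
    $link = mysql_connect('localhost', 'user', 'password'); // assumed credentials
    mysql_select_db('mydb', $link);                         // assumed database
    stream_rows('SELECT id, name FROM my_table', $link);
}
```

Note that, per the quoted caveat, mysql_num_rows() would not work on $result inside the loop.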

Upvotes: 1

floriank

Reputation: 25698

Why do you need that many rows? I'm pretty sure PHP dies because it runs out of memory.

Run your query in a console to see whether you can fetch all 7000+ rows there without issues. If it returns them all, you can be sure the problem is PHP and not your database - and I'm sure it is PHP.

Whatever you do, it would be better to loop over the data - also known as "pagination" - and read it in chunks of, for example, 100 rows, processing each chunk in turn: 0-100, 100-200, 200-300...
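That chunked loop can be sketched like this, again assuming the legacy mysql API and a placeholder table name; paged_queries() is a hypothetical helper that just builds the LIMIT offset, size windows:

```php
<?php
// Hypothetical helper: build one query per LIMIT window so a large table
// can be read in fixed-size chunks (0-100, 100-200, 200-300, ...).
function paged_queries($sql, $total, $chunk)
{
    $queries = array();
    for ($offset = 0; $offset < $total; $offset += $chunk) {
        $queries[] = $sql . " LIMIT $offset, $chunk";
    }
    return $queries;
}

// Usage sketch, guarded so it only runs where the legacy extension exists;
// the connection is assumed to be open already and the table name is a placeholder.
if (function_exists('mysql_query')) {
    foreach (paged_queries('SELECT id, name FROM my_table', 7000, 100) as $sql) {
        $result = mysql_query($sql);
        while ($row = mysql_fetch_assoc($result)) {
            // process one row of this 100-row chunk
        }
        mysql_free_result($result);
    }
}
```

Each iteration holds at most 100 rows in memory, which is why the 6400-row ceiling disappears.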

Upvotes: 2
