Kylie

Reputation: 11749

Laravel 3 and MySQL memory limit

I have a problem with a query that I have in Laravel 3.

My table is only about 90,000 rows so far, which isn't big at all by MySQL standards. Honestly, I was only expecting to run into this sort of problem somewhere near 10-20 million rows.

But anyway... I am using this table for stats, and it seems I can't query the whole table through Laravel, as it hits a memory limit.

Using...

public function action_viewcalls()
{
    $calls = Call::get();
    var_dump($calls);
}

Returns nothing. Just a blank page.

If I limit it to just the ID or Phone_no

$calls = Call::get(array('id'));
var_dump($calls);

Then I can get all the rows and loop through them. But if I try to get more than three of the five columns on this table, I get nothing back and can't loop over the results.

I don't even get an error message or anything about MySQL memory limits.

I just get a blank white page. Even if I do...

public function action_viewcalls()
{
    $calls = DB::query("SELECT * FROM calls");
    var_dump($calls);
}

Any ideas? It doesn't feel like 90,000 rows should be causing problems. What could be the reason behind this?

Thanks

Upvotes: 0

Views: 978

Answers (1)

FireSBurnsmuP

Reputation: 953

As ceejayoz mentioned in the comments, it's likely that PHP hit its memory limit.

This happens because when you call get() you are asking Laravel to return ALL the rows at once, and 90k rows, depending on what each row contains, can be a lot of memory to allocate in one go.
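
You can usually confirm that the memory limit is the culprit: when PHP exceeds memory_limit it dies with a fatal error, and with display_errors off that shows up as exactly the blank white page described in the question. A quick check using only standard PHP calls (development only):

    // Surface the fatal "Allowed memory size ... exhausted" error
    // instead of a silent blank page.
    error_reporting(E_ALL);
    ini_set('display_errors', '1');

    $calls = Call::get();

    // If the query survives, see how close it came to the limit.
    var_dump(memory_get_peak_usage(true), ini_get('memory_limit'));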

I ran into this problem myself today, with a lot fewer than 90k rows: using the Eloquent ORM and eager-loaded relationships on a data set of fewer than 1,000 records.

Some possible solutions to still process and/or display all your records:

  • One Laravel-y idea: use Pagination to process and display only a portion of the data at a time. I'm not sure whether this is available in Laravel 3.x, but Laravel 4 definitely has it (see the first sketch after this list).
  • Use a loop that combines take() and skip() (docs here, at the bottom of the section) to step through the large data set in chunks. You do need to be aware of PHP's execution time limit (max_execution_time) as well, though, as that can still cause problems with this solution.

    $take = 100;    // chunk size; adjust this however you choose
    $counter = 0;   // used to skip over the rows you've already processed
    // Order by the primary key: without an explicit ORDER BY,
    // skip/take paging isn't guaranteed to be deterministic.
    while ($rows = Call::order_by('id')->take($take)->skip($counter)->get())
    {
        // get() returns an empty array once the rows run out,
        // which is falsy and ends the loop.
        // process these rows
        // ...
        $counter += $take;
    }
    
    
  • Another option is to simply grab the IDs up front and load the corresponding records as you need them. For example:

    $ids = Call::get(array('id'));
    foreach ($ids as $row)
    {
        // get() returns model objects, so pull the key off each one
        // and grab the full record by its primary key...
        $call = Call::find($row->id);
        if ($call)
        {
            // do stuff with this one
        }
    }
    

    This is still subject to the max execution time problem, though, as processing that many records on a single page load is likely to exceed it, no matter what you do.

  • You could also increase PHP's memory limit (the memory_limit setting in php.ini, or ini_set('memory_limit', '256M') at runtime), but this might not always be possible in a production environment, depending on your hosting provider.
  • My suggestion: if Laravel's Pagination doesn't do what you want (as it didn't in my case), build your own pagination using the take-skip method above, with dynamic links that reload the page passing the take and skip values in the URI (see the second sketch after this list). I'm assuming here that you plan on showing some portion of this information to the user on the page. Done right, this avoids issues with PHP's max execution time and memory limits, and it can work well and look good. You could even use a <select> with an onChange handler that reloads a portion of the page via AJAX, if you're feeling really ambitious.
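
For the Pagination route, here's a minimal sketch. The paginate() and links() calls below are the Laravel 3 Paginator API as I remember it, and the calls.index view name is just a placeholder; double-check the method names against the 3.x docs before relying on them:

    public function action_viewcalls()
    {
        // Let the framework fetch one page of 50 rows at a time,
        // instead of materialising all 90k rows in one go.
        $calls = Call::paginate(50);

        // In Laravel 3 the current page's rows live on $calls->results,
        // and $calls->links() renders the page links in the view.
        return View::make('calls.index')->with('calls', $calls);
    }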
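
And a sketch of the hand-rolled take-skip version. The page query-string parameter and the calls URI are placeholders of mine; Input::get() and URL::to() are standard Laravel 3 calls:

    public function action_viewcalls()
    {
        $per_page = 100;
        // Which chunk of the table to show; defaults to the first page.
        $page = max(1, (int) Input::get('page', 1));

        $calls = Call::order_by('id')
            ->skip(($page - 1) * $per_page)
            ->take($per_page)
            ->get();

        // Link that reloads this action with the next skip/take window,
        // e.g. /calls?page=2
        $next = URL::to('calls?page=' . ($page + 1));

        return View::make('calls.index')
            ->with('calls', $calls)
            ->with('next', $next);
    }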

Upvotes: 2
