Reputation: 19507
Say I have a simple query to get all rows from a table:
App::make('My_Model')->get();
But I have a huge number of rows - say 100,000 - so many that PHP runs out of memory trying to read them all in.
For some reason I want to go through each of these rows, and do some operation based on them.
Is there any way to tell Laravel to give me one row at a time, the way the old mysql_fetch_row() worked?
Note: I'm currently solving this by using limit() clauses to get 5000 rows at a time, which works fine, except that it repeats the query 20 times.
I want to know if there is any built-in method in Laravel or Eloquent to give me the next row instead of all rows.
Upvotes: 8
Views: 6748
Reputation: 81
For anybody landing here using Laravel 5.2+ (like I did): there is now a cursor() method available, which returns a Generator.
https://laravel.com/api/5.2/Illuminate/Database/Query/Builder.html#method_cursor
So instead of ->get(), use ->cursor().
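A minimal usage sketch, reusing the My_Model example from the question (any where() constraints can be chained before cursor() just as with get()):
foreach (App::make('My_Model')->cursor() as $my_model)
{
    // only one model is hydrated in memory per iteration
}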
Upvotes: 8
Reputation: 770
What kind of operation are you trying to perform? The way you describe it, you are loading all 100,000 records into active memory, but there is almost no reason you should need to do that unless you actually want to display all 100,000 records.
Give us more information, what are you actually trying to accomplish?
Upvotes: 0
Reputation: 87769
The problem is: what is the "next row" in an application that is short-lived? Unless you are building a command-line application that keeps a cursor open, your application lives just for a single request. Laravel can give you ways to get a subset of your result set, so if you need 5000 rows, you can get those 5000 rows at once.
So you can use paginate:
$someUsers = User::where('votes', '>', 100)->paginate(5000);
Or skip() and take():
$page = 5;
$pageSize = 5000;
$users = User::skip($page * $pageSize)->take($pageSize)->get();
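To walk the whole table this way you would wrap that in a loop; a minimal sketch (the loop itself is my addition, not part of the API) looks like this:
$page = 0;
$pageSize = 5000;
do
{
    // fetch the next page of 5000 rows
    $users = User::skip($page * $pageSize)->take($pageSize)->get();
    foreach ($users as $user)
    {
        // process one row, then let it go out of scope
    }
    $page++;
}
while ($users->count() === $pageSize);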
Chunk internally uses pagination, but it's a little more restrictive, because it forces you to process your whole result set:
User::chunk(5000, function($users)
{
// every time this closure runs you get a new subset of 5000 rows
foreach ($users as $user)
{
// do whatever you need with them
}
});
Side note: you can save memory by turning the query log off:
DB::disableQueryLog();
Upvotes: 1
Reputation: 10794
If you want to process them a thousand at a time, say, the chunk() method in Eloquent should work for you:
My_Model::chunk(1000, function($my_models)
{
foreach ($my_models as $my_model)
{
// do something with $my_model
}
});
More in the Eloquent docs
Depends on what you are doing with your data...
Upvotes: 5