Samir B

Reputation: 152

Best way to update a lot of entries in Laravel

I have a file from my supplier which gives me an update of the stocks every hour. I want to update my database each hour, but the task is too heavy: the page takes almost 10 seconds to load because the file contains almost 10,000 entries.

Here is the file from my supplier:

stock.csv

ref, qty
XXX01, 5
XXX02, 10
XXX03, 3
XXX04, 8
XXX05, 6
...

I get this file and convert it into an array with this fabulous library: Laravel Excel. The result:

Array
0 => ['ref' => 'XXX01', 'qty' => 5]
1 => ['ref' => 'XXX02', 'qty' => 10]
2 => ['ref' => 'XXX03', 'qty' => 3]
...


Then I make a foreach loop to update each item in my database.
My items table looks like this:

id, ref, (...), qty
1, XXX01, 0
2, XXX02, 3
3, XXX03, 1
4, XXX04, 2
5, XXX05, 0

Here is my code:

// convert .csv into Array
$path = public_path('stock').'/stock.csv';
Excel::setDelimiter(',');
$data = Excel::load($path, function($reader){})->get()->toArray();
// wrap it in a Laravel Collection
$stocks = new Collection($data);

// The processing: update each item one by one
foreach ($stocks as $stock){
    $item = Item::where('ref', $stock['ref'])->first();
    $item->qty = $stock['qty'];
    $item->save();
}

// Too long...

Is there another option in Laravel to make this faster?

Upvotes: 1

Views: 2414

Answers (3)

Samir B

Reputation: 152

Thank you @toyi & @Devon.

Here is my new code:

$path = public_path('stocks').'/tyce_stocks.csv';
Excel::setDelimiter(',');
$data = Excel::load($path, function($reader){})->get()->toArray();
$stocks = new Collection($data);

try{
    DB::transaction(function() use ($stocks) {
        foreach ($stocks as $stock){
            DB::table('items')
                ->where('ref', $stock['ref'])
                ->update(['qty' => $stock['qty']]);
        }
    });
}catch(\Throwable $e){
    // send mail with error
}

It's much faster, the page doesn't crash anymore, and I run it from a scheduled command that runs every day at 1:00am and 1:00pm:

$schedule->command('stock:update')->twiceDaily(1, 13);
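
For reference, a minimal sketch of what the command class behind stock:update could look like. The signature comes from the schedule entry above; the class name and the rest of the wiring are assumptions (as generated by artisan make:command), and the handle() body is just the update code above with the error-mail part omitted:

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Collection;
use Illuminate\Support\Facades\DB;
use Maatwebsite\Excel\Facades\Excel;

class UpdateStock extends Command
{
    // Signature matches the schedule entry above; the class name is an assumption
    protected $signature = 'stock:update';

    protected $description = 'Update item quantities from the supplier stock file';

    public function handle()
    {
        $path = public_path('stocks').'/tyce_stocks.csv';
        Excel::setDelimiter(',');
        $data = Excel::load($path, function ($reader) {})->get()->toArray();
        $stocks = new Collection($data);

        // One transaction around all the updates, as in the snippet above
        DB::transaction(function () use ($stocks) {
            foreach ($stocks as $stock) {
                DB::table('items')
                    ->where('ref', $stock['ref'])
                    ->update(['qty' => $stock['qty']]);
            }
        });
    }
}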

Thanks !

Upvotes: 1

Anthony Aslangul

Reputation: 3847

As Devon mentioned, you can update directly without retrieving the records first. Another way to significantly improve the update speed is to use a transaction.

You can achieve this by doing:

try{
    DB::transaction(function(){
        //make your updates here
    });
}catch(\Exception $e){
    //gone wrong...
}

It will automatically commit (i.e. save your changes) if no exception is thrown; otherwise it will roll back your changes.

There are two major advantages here.

First, everything related to database writes (inserts / updates) will be much faster inside the transaction() call.

Second, if there is an error (and in my experience, mass inserts/updates from files often go wrong), your changes will not be committed: they will not be saved to the database.

You can find Laravel's documentation about transactions here.

EDIT

Since we are mainly talking about pure speed, I didn't cover queued jobs, but a file import/export with a lot of data fits the purpose of this feature well. Your script will not execute any faster, but you won't be stuck waiting for it to finish: it will run separately in another process. The documentation about jobs is available here.
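
To give an idea, here is a minimal sketch of such a job; the class name, constructor, and dispatch site are assumptions, just an illustration of the feature:

<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class UpdateStocks implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    private $stocks;

    // $stocks is the array parsed from the CSV
    public function __construct(array $stocks)
    {
        $this->stocks = $stocks;
    }

    public function handle()
    {
        // Same transaction-wrapped update loop as above
        DB::transaction(function () {
            foreach ($this->stocks as $stock) {
                DB::table('items')
                    ->where('ref', $stock['ref'])
                    ->update(['qty' => $stock['qty']]);
            }
        });
    }
}

Then, instead of running the loop in the request, you would dispatch it:

UpdateStocks::dispatch($stocks->toArray());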

Upvotes: 2

Devon Bessemer

Reputation: 35337

Updating the rows without retrieving the records would definitely save a lot of queries:

foreach ($stocks as $stock){
    Item::where('ref', $stock['ref'])->update(['qty' => $stock['qty']]);
}

If you often have many rows with the same quantity, it may be faster to group them:

foreach ($stocks->groupBy('qty') as $qty => $group) {
    Item::whereIn('ref', $group->pluck('ref'))->update(['qty' => $qty]);
}
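
To see why this helps, suppose hundreds of items share the same quantity (e.g. many refs at qty 0): each group then collapses into a single UPDATE. The refs below are placeholders:

// One query per distinct quantity instead of one per row; roughly:
// update `items` set `qty` = 0 where `ref` in ('XXX01', 'XXX05', ...)
Item::whereIn('ref', ['XXX01', 'XXX05'])->update(['qty' => 0]);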

Upvotes: 4
