Reputation: 65
I have a performance issue with bulk inserts into the DB using Laravel 5.2. I fetch the data from server1, format it, and bulk-insert it into server2. Initially the performance seems fine, but as the number of records grows, the inserts take longer and longer.
How to increase the performance?
My Controller is:
public function oceanupload(){
    $this->getRejectedR11($code, $countryName, $type);
}

function getRejectedR11($code, $countryName, $type){
    $count = RejectedR11::count();
    $chunksize = 30000;
    $chunks = floor($count / $chunksize);
    for ($chunk = 0; $chunk <= $chunks; $chunk++) {
        $offset = $chunk * $chunksize;
        $data = RejectedR11::skip($offset)->take($chunksize)->get();
        if(!empty($data)){
            OceanUpload::insertRejected($data, $code, $type);
            //RejectedR11::deleteRejectedR11($data);
        }
    }
}
RejectedR11 Model:
<?php
namespace App\Models\Soa\Ocean;

use Illuminate\Database\Eloquent\Model;
use DB;

class RejectedR11 extends Model{
    protected $table = "R11RejectedData";
    public $primaryKey = "R11RejectedDataID";
    public $connection = "ocean";

    public static function deleteRejectedR11($data){
        for ($i = 0; $i < count($data); $i++) {
            DB::connection('ocean')->statement('DELETE FROM R11RejectedData WHERE R11RejectedDataID = '.$data[$i]['R11RejectedDataID']);
        }
    }
}
?>
OceanUpload Model:
<?php
namespace App\Models\SOA;

use Illuminate\Database\Eloquent\Model;
use Session;
use DB;

class OceanUpload extends Model{
    public $table = 'fanda_soa_ocean';
    public $primaryid = 'ocean_id';
    public $timestamps = false;
    protected $connection = 'main_db';

    public static function insertRejected($data, $code, $type){
        $res = array();
        if(!empty($data)){
            for ($i = 0; $i < count($data); $i++) {
                $invodate = explode(" ",str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['InvoiceDate']) ))));
                $duedate = isset($data[$i]['DueDate'])?explode(" ",str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['DueDate']) )))):"-";
                $res[] = array(
                    'ocean_countrycode'=>$code,
                    'ocean_countrystruct'=>$type,
                    'ocean_status'=>"Rejected",
                    'ocean_vendorname'=>isset($data[$i]['VendorName'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['VendorName']) ))):"-",
                    'ocean_vendornum'=>isset($data[$i]['VendorNum'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['VendorNum']) ))):"-",
                    'ocean_vendorsite'=>isset($data[$i]['VendorSiteCode'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['VendorSiteCode']) ))):"-",
                    'ocean_currency'=>isset($data[$i]['InvoiceCurrencyCode'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['InvoiceCurrencyCode']) ))):"-",
                    'ocean_invoicenum'=>isset($data[$i]['InvoiceNum'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['InvoiceNum']) ))):"-",
                    'ocean_invoiceamt'=>isset($data[$i]['InvoiceAmountSUM'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['InvoiceAmountSUM']) ))):"-",
                    'ocean_invodate'=>date('Y-m-d',strtotime($invodate[0])),
                    'ocean_termsdate'=>isset($data[$i]['TermsDate'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['TermsDate']) ))):"-",
                    'ocean_approvstatus'=>isset($data[$i]['ApprovalStatus'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ApprovalStatus']) ))):"-",
                    'ocean_approvname'=>isset($data[$i]['ApproverName'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ApproverName']) ))):"-",
                    'ocean_approvdate'=>isset($data[$i]['ApprovalActionDate'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ApprovalActionDate']) ))):"-",
                    'ocean_duedate'=>($duedate <> "-")?date('Y-m-d',strtotime($duedate[0])):"-",
                    'ocean_invocurrcode'=>isset($data[$i]['InvoiceCurrencyCode'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['InvoiceCurrencyCode']) ))):"-",
                    'ocean_imagerefnum'=>isset($data[$i]['ImagingRefNumber'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ImagingRefNumber']) ))):"-",
                    'ocean_reasonlabel'=>isset($data[$i]['ReasonLabel'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ReasonLabel']) ))):"-",
                    'ocean_reasoncode'=>isset($data[$i]['ReasonCode'])?str_replace('?','',trim(preg_replace("/[^a-zA-Z0-9_.\s()+-@:\"\'\\\\\&-]+/", "",preg_replace('/_x000D_/', '', $data[$i]['ReasonCode']) ))):"-",
                    'ocean_recent'=>1,
                    'ocean_uploadby'=>Session::get('login_id'),
                    'ocean_uploadon'=>date('Y-m-d'),
                );
                foreach ($res as $val):
                    DB::statement('UPDATE '.'fanda.fanda_soa_ocean'.' SET ocean_recent = 0 where ocean_countrycode = "'.$val['ocean_countrycode'].'" and ocean_vendorname = "'.$val['ocean_vendorname'].'" and ocean_invoicenum = "'.$val['ocean_invoicenum'].'"');
                    DB::statement('INSERT IGNORE INTO '.'fanda.fanda_soa_ocean'.' ('.implode(',',array_keys($val)).') values (?'.str_repeat(',?',count($val) - 1).') ON DUPLICATE KEY UPDATE ocean_recent = 1',array_values($val));
                endforeach;
            }
        }
    }
}
?>
Upvotes: 0
Views: 728
Reputation: 11
I am far from a Laravel expert, but I recently had to make a potentially huge update request to a MySQL DB in OctoberCMS (which is based on Laravel). So here is my humble example of a fast function using "insert on duplicate key update", with the ability to insert multiple entries at once:
class MyModel extends Model
{
    ...

    /**
     * @param array $columns Columns definition (map), e.g.: ['id', 'date', 'col2', [...]]
     * @param array $sets Actual data for insert [[1, '2018-03-01', 'someVal'], [...]]
     * @return bool
     * @throws \Exception
     */
    public static function onDuplicateKeyUpdate(array $columns, array $sets = [])
    {
        $colCount = count($columns);
        if (!$sets || $colCount < 2) {
            return false;
        }
        $model = new static();
        $keyName = $model->getKeyName();
        // The key column must be part of $columns, otherwise MySQL cannot
        // detect duplicates (note: in_array, not array_key_exists, since
        // $columns is a plain list of column names).
        if (!in_array($keyName, $columns, true)) {
            throw new \Exception('Columns should contain key `' . $model->getKeyName() . '`');
        }
        $model->addFillable([$keyName]);
        $sqlParams = $sqlVals = $sqlCols = [];
        foreach ($sets as $k => $attributes) {
            if (array_diff_key($columns, $attributes)) {
                throw new \Exception('Invalid attributes provided. Should be: ' . print_r($columns, true) . ' Provided: ' . print_r($attributes, true));
            }
            $model->fill(array_combine($columns, $attributes));
            if (count($attributes) != count($model->getAttributes())) {
                throw new \Exception('Impossible to process some columns: ' . print_r($columns, true) . '`');
            }
            $model->validate();
            $sqlVals[] = '(?' . str_repeat(',?', $colCount - 1) . ')';
            array_push($sqlParams, ...$attributes);
        }
        foreach ($columns as $column) {
            if ($column !== $keyName) {
                $sqlCols[] = '`' . $column . '`=VALUES(`' . $column . '`)';
            }
        }
        $query = $model->newBaseQueryBuilder();
        $grammar = $query->getGrammar();
        $connection = $query->getConnection();
        $table = $grammar->wrapTable($model->getTable());
        $columns = ' (' . $grammar->columnize($columns) . ')';
        return $connection->insert('INSERT INTO ' . $table . $columns . ' VALUES ' . implode(',', $sqlVals) . ' ON DUPLICATE KEY UPDATE ' . implode(', ', $sqlCols), $sqlParams);
    }
}
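To see the shape of the statement the method above assembles, here is the same placeholder-building logic reproduced in plain PHP, outside the framework (the table and column names are made up for illustration):

```php
<?php
// Sketch of the SQL string onDuplicateKeyUpdate() assembles.
// Table/column names here are hypothetical.
$columns = ['id', 'date', 'col2'];
$keyName = 'id';
$sets = [[1, '2018-03-01', 'a'], [2, '2018-03-02', 'b']];

$colCount = count($columns);
$sqlVals = [];
$sqlParams = [];
foreach ($sets as $attributes) {
    // One "(?,?,...)" group per row, with the parameters flattened in order.
    $sqlVals[] = '(?' . str_repeat(',?', $colCount - 1) . ')';
    array_push($sqlParams, ...$attributes);
}

// Every non-key column is refreshed from the incoming values on duplicate.
$sqlCols = [];
foreach ($columns as $column) {
    if ($column !== $keyName) {
        $sqlCols[] = '`' . $column . '`=VALUES(`' . $column . '`)';
    }
}

$sql = 'INSERT INTO `my_table` (`' . implode('`,`', $columns) . '`) VALUES '
     . implode(',', $sqlVals)
     . ' ON DUPLICATE KEY UPDATE ' . implode(', ', $sqlCols);
echo $sql;
```

With the two sample rows this produces a single two-row statement with six bound parameters, so the whole batch travels to the server in one round trip.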
Idea here is using maximum of model features such as types conversion, checking if column is allowed to be updated, etc. Also used some laravel techniques such wrapping table and columns.
I wrote it being impressed by this answer: https://stackoverflow.com/a/35843210/9924742
Upvotes: 1
Reputation: 25936
Your code was getting slower because you were running the UPDATE and INSERT queries for all items in $res every time you added a new item. So you were running the same queries over and over again.
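A small counter makes the effect visible: with the foreach nested inside the build loop, the number of executed statements grows quadratically with the number of rows.

```php
<?php
// Simulates the statement count when the foreach over $res sits inside
// the build loop (as in the question) versus after it.
function statementsInsideLoop(int $rows): int {
    $executed = 0;
    $res = [];
    for ($i = 0; $i < $rows; $i++) {
        $res[] = $i;                 // add one item...
        foreach ($res as $val) {     // ...then re-run queries for ALL items so far
            $executed += 2;          // one UPDATE + one INSERT per item
        }
    }
    return $executed;
}

function statementsAfterLoop(int $rows): int {
    $executed = 0;
    $res = range(0, $rows - 1);      // build the whole array first
    foreach ($res as $val) {         // then run the queries once per item
        $executed += 2;
    }
    return $executed;
}

echo statementsInsideLoop(1000) . "\n";  // 1001000
echo statementsAfterLoop(1000) . "\n";   // 2000
```

For 30,000-row chunks the nested version would issue roughly 900 million statements instead of 60,000, which matches the described slowdown as the record count grows.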
You could improve your DELETE queries by collecting the ids and running one query at the end:
DELETE FROM R11RejectedData WHERE R11RejectedDataID IN ($ids)
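Building that statement with bound placeholders rather than interpolating the ids might look like this in plain PHP (the ids are sample values):

```php
<?php
// Collect the ids first, then build one parameterized DELETE with an
// IN (...) placeholder list instead of issuing one statement per row.
$ids = [101, 102, 105];  // e.g. the R11RejectedDataID values of a chunk

$placeholders = implode(',', array_fill(0, count($ids), '?'));
$sql = "DELETE FROM R11RejectedData WHERE R11RejectedDataID IN ($placeholders)";

echo $sql;  // DELETE FROM R11RejectedData WHERE R11RejectedDataID IN (?,?,?)
// In Laravel this could then run as:
// DB::connection('ocean')->delete($sql, $ids);
```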
You could improve your INSERT and UPDATE queries by using prepared statements:
$sql = 'INSERT INTO table1 (column1, ...) VALUES (?, ...)';
$statement = DB::getPdo()->prepare($sql);

foreach (...) {
    $statement->execute([$value1, ...]);
}

$statement->closeCursor();
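Going one step further, the rows can be batched into a single multi-row INSERT so each chunk needs only one round trip. A sketch of building the placeholder list in plain PHP (table and column names borrowed from the question, sample data made up):

```php
<?php
// Builds one multi-row INSERT for a whole batch of rows.
$rows = [
    ['US', 'Rejected', 'INV-1'],
    ['US', 'Rejected', 'INV-2'],
];
$columns = ['ocean_countrycode', 'ocean_status', 'ocean_invoicenum'];

// One "(?,?,?)" group per row.
$rowPlaceholder = '(' . implode(',', array_fill(0, count($columns), '?')) . ')';
$sql = 'INSERT INTO fanda_soa_ocean (' . implode(',', $columns) . ') VALUES '
     . implode(',', array_fill(0, count($rows), $rowPlaceholder));

// Flatten the row values to match the placeholder order.
$params = array_merge(...$rows);

echo $sql;
// Then: DB::getPdo()->prepare($sql)->execute($params);
```

Note that MySQL limits the total statement size via max_allowed_packet, so very large chunks may still need to be split into several batches.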
Upvotes: 3