Reputation: 37
I'm trying to fetch XML from a URL and then update the corresponding rows in the database using Laravel. This is how I do it:
public function transaction_update($data, $id_site)
{
    DB::beginTransaction();
    try {
        foreach ($data as $item) {
            $row = $this->ratesService->generate_array($item);
            $row['id_site'] = $id_site;
            //$row = array_filter($row);

            // Build a fresh query for every row, otherwise the where()
            // clauses keep stacking up on the same builder.
            DB::table('rates')
                ->where('id_site', $id_site)
                ->where('from', $row['from'])
                ->where('to', $row['to'])
                ->update($row);
        }
        DB::commit();
    } catch (\Throwable $e) {
        DB::rollback();
        return $e;
    }
}
For example, this URL https://newline.online/exportxml.xml contains about 6k XML items that I need to update in my database, and the update takes about 3 minutes. How can I increase the speed?
DB structure
CREATE TABLE `rates` (
`id_site` int(11) NOT NULL,
`from` varchar(20) NOT NULL,
`to` varchar(20) NOT NULL,
`in` float NOT NULL,
`out` float NOT NULL,
`r1` float NOT NULL,
`r2` float NOT NULL,
`real_r1` float NOT NULL,
`real_r2` float NOT NULL,
`amount` float NOT NULL,
`param` varchar(20) NOT NULL,
`minamount` varchar(20) NOT NULL,
`maxamount` varchar(20) NOT NULL,
`fromfee` varchar(20) NOT NULL,
`tofee` varchar(20) NOT NULL,
`city` varchar(50) NOT NULL,
`status` int(11) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;
Upvotes: 1
Views: 433
Reputation: 3835
Increasing speed usually means removing bottlenecks, and to do that you first need to identify them. Personally I would split the import up into multiple steps:

1. Download the XML feed.
2. Parse it and split the items into smaller batches.
3. Import each batch into the database.

This way you can run multiple workers for the 3rd process, which will increase the overall performance, as I'm guessing the database import is the bottleneck. The database locking might pose an issue for these multiple processes; in that case you'll have to work around it somehow.
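As a rough sketch of what that third step could look like, assuming the feed has already been parsed into plain arrays of column values (the job class, property names and chunk size below are illustrative, not taken from the question):

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ImportRatesChunk implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public array $rows;   // already-parsed rate arrays for this chunk
    public int $idSite;

    public function __construct(array $rows, int $idSite)
    {
        $this->rows = $rows;
        $this->idSite = $idSite;
    }

    public function handle(): void
    {
        foreach ($this->rows as $row) {
            DB::table('rates')
                ->where('id_site', $this->idSite)
                ->where('from', $row['from'])
                ->where('to', $row['to'])
                ->update($row);
        }
    }
}

Dispatching could then look like this; every chunk becomes a job that any free queue worker (php artisan queue:work) can pick up in parallel:

collect($rows)->chunk(500)->each(
    fn ($chunk) => ImportRatesChunk::dispatch($chunk->values()->all(), $id_site)
);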
Importing 6k records is quite a heavy thing to do, by the way, so don't expect the import process to drop below a minute or so (rough estimate).
Other things you can do to increase the performance:
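One example, just as a sketch: the rates table in the question has no primary key and no index at all, so every single UPDATE has to scan the whole table to find its row. A composite index covering the columns used in the WHERE clause (id_site, from, to) lets MySQL jump straight to the matching row. A possible migration (the index name is illustrative):

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::table('rates', function (Blueprint $table) {
            // Matches the WHERE clause of the update loop in the question.
            $table->index(['id_site', 'from', 'to'], 'rates_site_from_to_index');
        });
    }

    public function down(): void
    {
        Schema::table('rates', function (Blueprint $table) {
            $table->dropIndex('rates_site_from_to_index');
        });
    }
};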
Upvotes: 1