DispooL

Reputation: 37

10k Laravel updates as fast as possible

I'm trying to fetch XML from a URL and then update the matching rows in the database using Laravel. This is how I do it:

public function transaction_update($data, $id_site)
{
    DB::beginTransaction();
    try {
        foreach ($data as $item) {
            // Use a separate variable so the loop source $data isn't overwritten.
            $row = $this->ratesService->generate_array($item);
            $row['id_site'] = $id_site;

            // Build a fresh query per row; reusing one builder would keep
            // stacking where() clauses from previous iterations.
            DB::table('rates')
                ->where('id_site', $id_site)
                ->where('from', $row['from'])
                ->where('to', $row['to'])
                ->update($row);
        }
        DB::commit();
    } catch (\Throwable $e) {
        DB::rollback();
        return $e;
    }
}

For example, this URL https://newline.online/exportxml.xml contains 6k XML items that I need to update in my database, and it currently takes about 3 minutes. How can I increase the speed?

DB structure

CREATE TABLE `rates` (
  `id_site` int(11) NOT NULL,
  `from` varchar(20) NOT NULL,
  `to` varchar(20) NOT NULL,
  `in` float NOT NULL,
  `out` float NOT NULL,
  `r1` float NOT NULL,
  `r2` float NOT NULL,
  `real_r1` float NOT NULL,
  `real_r2` float NOT NULL,
  `amount` float NOT NULL,
  `param` varchar(20) NOT NULL,
  `minamount` varchar(20) NOT NULL,
  `maxamount` varchar(20) NOT NULL,
  `fromfee` varchar(20) NOT NULL,
  `tofee` varchar(20) NOT NULL,
  `city` varchar(50) NOT NULL,
  `status` int(11) NOT NULL
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

Upvotes: 1

Views: 433

Answers (1)

PtrTon

Reputation: 3835

Increasing speed usually means removing bottlenecks, and to do that you first need to identify them. Personally, I would split the import into multiple steps:

  1. Downloading the XML file
  2. Splitting up the XML file in chunks
  3. Importing the chunks into the database

This way you can run multiple workers for the third step, which will increase overall performance, as I'm guessing the database import is the bottleneck. Database locking might pose an issue for these multiple processes; in that case you'll have to work around it somehow. A sketch of this approach follows below.
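Here is a minimal sketch of what steps 2 and 3 could look like using Laravel's queue, assuming a hypothetical ImportRatesChunk job class and a hypothetical parse() helper on the rates service that turns the XML into an array of rows; run several workers (php artisan queue:work) so the chunks are processed in parallel:

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Support\Facades\DB;

class ImportRatesChunk implements ShouldQueue
{
    use Dispatchable, Queueable;

    public function __construct(private array $chunk, private int $idSite) {}

    public function handle(): void
    {
        // Each worker updates its own chunk inside one transaction.
        DB::transaction(function () {
            foreach ($this->chunk as $row) {
                DB::table('rates')
                    ->where('id_site', $this->idSite)
                    ->where('from', $row['from'])
                    ->where('to', $row['to'])
                    ->update($row);
            }
        });
    }
}

// Step 2: split the parsed rows into chunks and queue one job per chunk.
$rows = $this->ratesService->parse($xml); // hypothetical parser
foreach (array_chunk($rows, 500) as $chunk) {
    ImportRatesChunk::dispatch($chunk, $id_site);
}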

Importing 6k records is quite a heavy operation, by the way, so you should not expect the import process to drop below a minute (rough estimate).

Other things you can do to increase performance:

  • Increase server CPU power
  • Increase server memory (only if this becomes a bottleneck)
  • Reduce the number of queries per update
  • Possibly switch to another form of storage, such as NoSQL
  • Think of other ways to import, for example:
    1. storing the latest XML
    2. comparing it to the freshly downloaded XML
    3. only importing the difference (see the second sketch below)
  • Check for bottlenecks and figure out how to remove them (as already mentioned)
  • As mentioned elsewhere, you could also split the update queries into chunks. That reduces the number of queries per loop iteration (see the first sketch below).
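On reducing the number of queries: the query builder's upsert() (available since Laravel 8) can write a whole chunk in a single INSERT ... ON DUPLICATE KEY UPDATE statement. Note that it needs a unique index covering (id_site, from, to), which the schema above currently lacks; adding that index alone should already help a lot, because without any index every UPDATE has to scan the whole table. A sketch, assuming such an index exists:

$rows = [];
foreach ($data as $item) {
    $row = $this->ratesService->generate_array($item);
    $row['id_site'] = $id_site;
    $rows[] = $row;
}

// One statement per 1000 rows instead of one UPDATE per row. Requires e.g.:
//   ALTER TABLE rates ADD UNIQUE KEY uq_rates (id_site, `from`, `to`);
foreach (array_chunk($rows, 1000) as $chunk) {
    DB::table('rates')->upsert(
        $chunk,
        ['id_site', 'from', 'to'],   // columns that identify a row
        ['in', 'out', 'r1', 'r2', 'real_r1', 'real_r2', 'amount', 'param',
         'minamount', 'maxamount', 'fromfee', 'tofee', 'city', 'status']
    );
}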
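And a sketch of the diff-based import from the list above, assuming the same hypothetical parse() helper and Laravel's Storage facade for keeping the previously imported XML around:

use Illuminate\Support\Facades\Storage;

$xml = file_get_contents('https://newline.online/exportxml.xml');

// Index the previously imported rows by their (from, to) pair.
$oldRows = Storage::exists('rates/last.xml')
    ? collect($this->ratesService->parse(Storage::get('rates/last.xml')))
        ->keyBy(fn ($r) => $r['from'].'|'.$r['to'])
    : collect();

// Keep only rows that are new or whose values actually changed.
$changed = collect($this->ratesService->parse($xml))
    ->filter(fn ($r) => ($oldRows[$r['from'].'|'.$r['to']] ?? null) != $r);

// ...update only $changed here, then remember this XML for next time.
Storage::put('rates/last.xml', $xml);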

Upvotes: 1
