Reputation: 21
I'm using Symfony 3.4 and Doctrine. I need to update a large number of entities (300k+) with Doctrine. I've read the batch-processing article in the Doctrine docs and related topics on Stack Overflow, but regardless of the batch size (20, 100, 200, 500) I still get an 'out of memory' error once I approach about 20k processed entities.
Can someone please give me a hint or a suggestion on how to avoid this? Here is my function:
protected function execute(InputInterface $input, OutputInterface $output): void
{
    $io = new SymfonyStyle($input, $output);
    $em = $this->getContainer()->get('doctrine.orm.entity_manager');
    // Disable SQL logging on the *connection* configuration; that is the
    // configuration the connection actually consults, not the ORM one.
    $em->getConnection()->getConfiguration()->setSQLLogger(null);

    $repository = $em->getRepository('AppBundle:Order');

    $qb = $repository->createQueryBuilder('o');
    $totalCount = (int) $qb->select($qb->expr()->count('o'))
        ->where($qb->expr()->eq('o.amountOut', 0))
        ->getQuery()
        ->getSingleScalarResult();

    $progressBar = $io->createProgressBar($totalCount);

    $query = $qb->select('o')
        ->where($qb->expr()->eq('o.amountOut', 0))
        ->getQuery();

    // iterate() hydrates one row at a time instead of the whole result set.
    $iterableResult = $query->iterate();
    $batchSize = 100;
    $i = 0;
    foreach ($iterableResult as $row) {
        /** @var Order $order */
        $order = $row[0];
        $commissionsArr = $this->calcCommissionInOutFromOrder($order);
        $amountOut = $order->getTransferAmount();
        $order->setAmountOut($amountOut);
        $order->setCommissionIn($commissionsArr['commission_in']);
        $order->setCommissionOut($commissionsArr['commission_out']);
        $em->persist($order);
        $progressBar->advance();

        // Increment before the check so the first batch is not flushed
        // immediately at $i === 0.
        ++$i;
        if (0 === ($i % $batchSize)) {
            $em->flush();
            $em->clear(); // detach all managed entities to free memory
        }
    }

    $em->flush();
    $progressBar->finish();
    $io->success('Success');
}
Upvotes: 1
Views: 1409
Reputation: 21
I found the actual answer in Memory leak when executing Doctrine query in loop.
Quoting: "I resolved this by adding --no-debug to my command. It turns out that in debug mode, the profiler was storing information about every single query in memory."
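For anyone who cannot simply run the command with --no-debug (e.g. php bin/console app:recalculate-orders --no-debug, where the command name is only a placeholder), here is a minimal sketch of a programmatic equivalent, assuming Doctrine DBAL 2.x and the standard 'doctrine' registry service. The logger the profiler uses is attached to the connection configuration, so that is the one to clear:

    // Sketch: disable SQL logging for every registered entity manager.
    // Clearing only $em->getConfiguration() has no effect, because the
    // connection reads the logger from its own configuration object.
    foreach ($this->getContainer()->get('doctrine')->getManagers() as $manager) {
        $manager->getConnection()->getConfiguration()->setSQLLogger(null);
    }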
It actually worked; I verified it with memory_get_usage().
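If you want to watch memory consumption yourself, here is a minimal sketch of such a probe inside the batch loop from the question (memory_get_usage() is a PHP built-in; the surrounding variables mirror the code above):

    if (0 === ($i % $batchSize)) {
        $em->flush();
        $em->clear();
        // Real memory allocated to PHP after each batch; if this number
        // keeps climbing, something is still holding references.
        $io->note(sprintf('Batch %d, memory: %.1f MiB',
            $i / $batchSize, memory_get_usage(true) / 1048576));
    }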
Upvotes: 1