ffuentes

Reputation: 1182

Laravel queue process gets timeout

I'm running an artisan command that runs a Laravel Excel import, but it exhausts memory after a while. I'm using chunks and it worked before, but now I'm struggling to make it work. The input is a group of files located in a folder on the filesystem.

This is the command that I run with artisan:

public function handle()
{
    $directory = 'pv';
    $files = Storage::allFiles($directory);
    \Log::info('Process started.');
    $start = microtime(true);
    ini_set('max_execution_time', 600);
    foreach ($files as $file) {
        $fname = basename($file);
        \Log::info('Processing', [$fname]);
        // The date is embedded in the file name.
        $arr = explode(' ', $fname);
        $day = substr($arr[2], 0, 10);
        $date = Carbon::parse($day);
        Excel::queueImport(new POSImport($date), $file);
    }
    $time = microtime(true) - $start;
    $me = '[email protected]';
    $msg = 'Process finished in ' . $time . ' secs.';
    Mail::to($me)->queue(new TasksFinished($msg));
    $this->call('calcular:previos', [
        '--queue' => 'default',
    ]);
}

It gets out of memory.
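To confirm that memory climbs while the jobs are being queued (the trace further down points at the payload serialization in Queue.php), a quick diagnostic is to log memory usage inside the loop. This is only a sketch to drop into handle(); memory_get_usage() and memory_get_peak_usage() are standard PHP:

```php
foreach ($files as $file) {
    // ... parse the date and queue the import as before ...
    Excel::queueImport(new POSImport($date), $file);

    // Log how much memory the artisan process holds after each file.
    \Log::info(sprintf(
        'Queued %s, memory: %.1f MB (peak %.1f MB)',
        basename($file),
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));
}
```

If the logged number grows steadily per file, the exhaustion happens in the queueing command itself rather than in the workers.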

This is the import.

<?php

namespace App\Imports;

use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithHeadingRow;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Illuminate\Contracts\Queue\ShouldQueue;
use App\Pos;
use App\Device;
use DateTime;

class POSImport implements ToCollection, WithHeadingRow, WithChunkReading, ShouldQueue
{
    public $tries = 3;

    public $date;

    public function __construct(DateTime $date)
    {
        $this->date = $date;
    }

    /**
     * Imports data from the Banco Estado "active points" spreadsheet.
     *
     * Acts on Device (terminals) and POS.
     *
     * @param Collection $rows
     */
    public function collection(Collection $rows)
    {
        ini_set('max_execution_time', 600);
        foreach ($rows as $row) {
            // Creates or updates the POS.
            if (!isset($row['marca'])) {
                return null;
            }
            // Looks up the POS (location) in the database.
            $pos = Pos::where('id', $row['pos'])->first();
            // If there is no registered POS, create it.
            if (!$pos) {
                $pos = new Pos;
                $pos->id = $row['pos'];
            }
            $pos->vigente = ($row['estado'] == 'VIGENTE');
            $pos->save();
            // Strips leading zeros from the serial.
            $serial = ltrim($row['serie_equipo'], '0');
            // Looks up the serial in the database.
            $device = Device::where('serial', $serial)
                ->where('fecha_recepcion', '<', $this->date)
                ->where('customer_id', 1)
                ->orderBy('fecha_recepcion', 'asc')
                ->first();

            if ($device && $device->pos_id != $row['pos'] && $device->fecha_instalacion != $this->date) {
                $device->pos_id = $pos->id;
                $device->fecha_instalacion = $this->date;
                $device->save();

                $device->pos()->attach($pos);
            }
        }
    }

    public function chunkSize(): int
    {
        return 2000;
    }
}

As you can see, I'm using WithChunkReading and ShouldQueue. When I ran this process in the past it just processed the chunks, but now the jobs table shows lots of QueueImport entries.

I'm using the database as the queue driver.
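For reference, the database queue driver assumes the jobs table exists; the usual setup is the standard artisan commands (the env variable is QUEUE_CONNECTION on recent Laravel versions, QUEUE_DRIVER on older 5.x):

```shell
# Create the migration for the jobs table and run it.
php artisan queue:table
php artisan migrate

# In .env (version-dependent name, see above):
# QUEUE_CONNECTION=database
```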

I hope you can help me out with this.

Error in the command:

Symfony\Component\Debug\Exception\FatalErrorException  : Allowed memory size of 536870912 bytes exhausted (tried to allocate 175747072 bytes)

  at C:\laragon\www\reportes\vendor\laravel\framework\src\Illuminate\Queue\Queue.php:138
    134|
    135|         return array_merge($payload, [
    136|             'data' => [
    137|                 'commandName' => get_class($job),
  > 138|                 'command' => serialize(clone $job),
    139|             ],
    140|         ]);
    141|     }
    142|



It's a lot of data, which is why I'm using chunks and queues, but I still have this problem.
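Since the error above is raised in the artisan process while the job payloads are serialized, one stopgap (a sketch only, it does not shrink the payloads themselves) is to raise the CLI memory limit at the top of handle(), next to the existing max_execution_time override:

```php
// Allow the queueing command more headroom than the default 512M
// that the error message shows being exhausted.
ini_set('memory_limit', '1024M');
```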

Upvotes: 0

Views: 5346

Answers (1)

senty

Reputation: 12847

You can set a timeout on the queued job itself:

class POSImport implements ShouldQueue
{
    /**
     * The number of seconds the job can run before timing out.
     *
     * @var int
     */
    public $timeout = 120;
}

Also, if you want to increase the queue worker's timeout, you can use the --timeout flag (the default is 60 seconds):

php artisan queue:work --timeout=300
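With the database driver, the --timeout value should stay a few seconds shorter than the connection's retry_after setting in config/queue.php; otherwise a slow chunk can be handed to a second worker while the first is still running it. A sketch of the relevant fragment (the 600 is an example value, not from the question):

```php
// config/queue.php
'connections' => [
    'database' => [
        'driver' => 'database',
        'table' => 'jobs',
        'queue' => 'default',
        'retry_after' => 600, // must exceed the worker's --timeout
    ],
],
```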


I am not sure about this, but it may also work:

$this->call('calcular:previos', [
    '--queue' => 'default',
    '--timeout' => '300'
]);

Upvotes: 2
