Reputation: 3261
Question: How to configure a long-running job so it is not attempted multiple times after each retry_after interval?
I have a job that takes 1 to 3 hours to run. I created the job based on the Laravel documentation; here is my job file.
<?php

namespace App\Modules\Csv\Jobs;

use App\Jobs\Job;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;
use League\Csv\Reader;
use Phone;

/**
 * A single CSV import job, which can be pushed onto a queue.
 */
class UploadCsvDataInTable extends Job implements ShouldQueue
{
    use InteractsWithQueue, SerializesModels, Dispatchable, Queueable;

    /**
     * The number of seconds the job may run before timing out.
     *
     * @var int
     */
    public $timeout = 172800;

    /**
     * The CSV upload to import.
     *
     * @var CsvUpload
     */
    protected $csvUpload;

    /**
     * Create a new job instance.
     *
     * @param CsvUpload $csvUpload
     *
     * @return void
     */
    public function __construct(CsvUpload $csvUpload)
    {
        $this->csvUpload = $csvUpload;
    }

    /**
     * Execute the job.
     *
     * @return void
     */
    public function handle()
    {
        app(CsvUploadService::class)->uploadCsv($this->csvUpload);
    }
}
Here is the Laravel documentation on specifying a timeout for jobs.
Here is how I dispatch that job:
UploadCsvDataInTable::dispatch($csvUpload)->onConnection('redis')->onQueue('low');
This is the queue:work command I run under Supervisor:
php artisan queue:work --queue=high,low,default --sleep=3 --tries=3
Here is my configuration for the queue and Horizon:
// horizon.php
'production' => [
    'supervisor-1' => [
        'connection' => 'redis',
        'queue' => ['high', 'default', 'low'],
        'balance' => 'simple',
        'processes' => 6,
        'tries' => 3,
    ],
],
// queue.php
'redis' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => env('REDIS_QUEUE', 'default'),
    'retry_after' => 90,
    'block_for' => null,
],
Because retry_after is only 90 seconds, the job is released back onto the queue and picked up again while the first attempt is still running, and after the 3 tries configured in Horizon it throws MaxAttemptsExceededException.
If I increase $timeout to 24 hours, I get duplicate records in my database, because retry_after keeps re-attempting the job while the original attempt is still running.
Is there any way to set a custom retry_after for this job?
Upvotes: 6
Views: 10085
Reputation: 41330
You don't need to set retry_after; you need to set tries to 1:
public $tries = 1;
https://laravel.com/docs/8.x/queues#max-attempts
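For reference, a minimal sketch of where that property would go, based on the job class from the question:

class UploadCsvDataInTable extends Job implements ShouldQueue
{
    // A single attempt: if retry_after releases the job back onto the
    // queue while the first attempt is still running, the retry fails
    // immediately instead of importing the CSV a second time.
    public $tries = 1;

    public $timeout = 172800;

    // ...
}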
Upvotes: 4
Reputation: 3261
I created another connection for long-running jobs and it is working properly for me.
First, I created a new supervisor entry in horizon.php for the long-running process:
'supervisor-long-running' => [
    'connection' => 'redis-long-processes',
    'queue' => 'long-running',
    'balance' => 'simple',
    'processes' => 3,
    'tries' => 1,
    'timeout' => 86000, // should be shorter than retry_after
],
And a new Redis connection in queue.php:
'redis-long-processes' => [
    'driver' => 'redis',
    'connection' => 'default',
    'queue' => 'long-running',
    'retry_after' => 86400,
    'block_for' => null,
],
In database.php, I added the new queue for long-running jobs:
'queue' => [
    [
        'connection' => 'redis',
        'queue' => ['high', 'default', 'low', 'long-running'],
        'balance' => 'simple',
        'processes' => 6,
        'tries' => 3,
        'url' => env('REDIS_URL'),
        'host' => env('REDIS_HOST', '127.0.0.1'),
        'password' => env('REDIS_PASSWORD', null),
        'port' => env('REDIS_PORT', '6379'),
        'database' => env('REDIS_CACHE_DB', '1'),
    ],
],
Also, don't forget to dispatch jobs with onConnection and onQueue to specify which connection and queue they should run on:
UploadDataInTable::dispatch($upload)->onConnection('redis-long-processes')->onQueue('long-running');
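One caveat: a timeout defined on the job class itself takes precedence over the worker's timeout, so the job's own $timeout should also stay below the new connection's retry_after of 86400. A minimal sketch of the adjusted property:

// In the job class: keep the job-level timeout below the
// connection's retry_after (86400) so the worker stops the job
// before Redis releases it back onto the queue.
public $timeout = 86000;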
Upvotes: 8