Reputation: 126
I have a Laravel project with a code block like the one below:
// Fetch files that have not yet hit the retry limit, oldest first
$pendingToProcessFiles = PendingToProcessFile::where('tries', '<', 3)->orderBy('created_at')->get();

Log::channel('upload-to-s3-jobs-log')->info("Using queue {$this->queue}");

// Dispatch one upload job per pending file onto the configured queue
foreach ($pendingToProcessFiles as $index => $pendingToProcessFile) {
    $jobObject = new UploadToS3Job($pendingToProcessFile);
    dispatch($jobObject->onQueue($this->queue));
}
This works as expected on my local machine (Laradock), no matter how many records are in the table behind the PendingToProcessFile model.
The problem happens on the Elastic Beanstalk instance, where not all records get processed. Each time the cron job triggers this code block, it might process 10 files out of 100 records, or 80 files out of 1000 records (and not in the order the results appear in the collection returned by the Eloquent query).
I tried to add some logging using a Supervisor config like this:
[program:upload-to-s3]
process_name=%(program_name)s_%(process_num)02d
command=php artisan queue:work sqs --queue=upload-to-s3 --timeout=600 --sleep=3 --tries=3
directory=/var/app/current
autostart=true
autorestart=true
user=root
numprocs=4
redirect_stderr=true
stdout_logfile=/etc/supervisord.d/output.log
stderr_logfile=/etc/supervisord.d/error.log
But nothing gets added to those log files (the log files are there, I have checked), so I have no idea what's going on here...
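For visibility independent of Supervisor's log files, queue activity can also be logged from the application side via Laravel's job events. A rough sketch (registered in AppServiceProvider::boot(), reusing the same log channel as above):

use Illuminate\Queue\Events\JobFailed;
use Illuminate\Queue\Events\JobProcessed;
use Illuminate\Support\Facades\Log;
use Illuminate\Support\Facades\Queue;

public function boot(): void
{
    // Log every job the worker finishes processing
    Queue::after(function (JobProcessed $event) {
        Log::channel('upload-to-s3-jobs-log')->info(
            "Processed {$event->job->resolveName()} on connection {$event->connectionName}"
        );
    });

    // Log failed jobs together with the exception message
    Queue::failing(function (JobFailed $event) {
        Log::channel('upload-to-s3-jobs-log')->error(
            "Failed {$event->job->resolveName()}: {$event->exception->getMessage()}"
        );
    });
}

That should at least show whether the workers pick up jobs at all.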
Btw, upload-to-s3 is a standard SQS queue.
If someone can give me any advice regarding this, it will be appreciated!
Upvotes: 1
Views: 337
Reputation: 126
For everyone facing an issue like this: make sure you are using different SQS queues/regions for the different environments in AWS. In this case the issue was that the Elastic Beanstalk development and production instances were using the same SQS queue in the same region, so the queue was, let's say, "shared" between environments, which gave the impression that jobs weren't being pushed to the queue.
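To illustrate the separation, each environment can point at its own queue URL prefix through its .env file. A rough sketch based on the standard sqs connection in config/queue.php (the account IDs and regions below are placeholders, not real values):

'sqs' => [
    'driver' => 'sqs',
    'key'    => env('AWS_ACCESS_KEY_ID'),
    'secret' => env('AWS_SECRET_ACCESS_KEY'),
    // prefix = https://sqs.<region>.amazonaws.com/<account-id>, set per environment
    'prefix' => env('SQS_PREFIX'),
    'queue'  => env('SQS_QUEUE', 'upload-to-s3'),
    'region' => env('AWS_DEFAULT_REGION', 'us-east-1'),
],

and then in each environment's .env (placeholder values):

# development instance
SQS_PREFIX=https://sqs.eu-west-1.amazonaws.com/111111111111
AWS_DEFAULT_REGION=eu-west-1

# production instance
SQS_PREFIX=https://sqs.us-east-1.amazonaws.com/222222222222
AWS_DEFAULT_REGION=us-east-1

With separate queues per environment, the development workers can no longer consume jobs that the production cron pushed, and vice versa.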
Upvotes: 1