Reputation: 186
I have an app that processes product and order updates, using a mixture of events and jobs to respond to changes in either.
Having worked on this for almost 2 years and watched it in action, I've found the biggest bottleneck is duplicate events (which can then lead to duplicate jobs, etc.) sent by any of the other interconnected systems.
Under normal operation this is fine; we have plenty of slack to accommodate a spike. But when it gets really busy it can snowball and lead to a backlog of unprocessed events and jobs.
The setup I'm using is a standard install of Laravel 4.2 with MySQL (MariaDB), a Redis-backed queue for jobs and events, and multiple queues to handle different kinds of events (product stock updates, order dispatch, etc.).
I was wondering if anyone could suggest a way to prevent Laravel from adding events and/or jobs that relate to the same action.
An example of a job would be:
{
    "job": "Illuminate\\Queue\\CallQueuedHandler@call",
    "data": {
        "commandName": "Staw\\Magento\\Jobs\\AddTrackingNumber",
        "command": "O:40:\"Staw\\Magento\\Jobs\\AddTrackingNumber\":6:{s:17:\"\u0000*\u0000trackingNumber\";s:14:\"[Tracking Number]\";s:14:\"\u0000*\u0000orderNumber\";s:19:\"[Order Number]\";s:10:\"connection\";N;s:5:\"queue\";s:8:\"despatch\";s:5:\"delay\";N;s:6:\"\u0000*\u0000job\";N;}"
    },
    "id": "xGmsi6vo458cYGeJHDfpGCZG8QuAJXB7",
    "attempts": 1
}
And an event listener (from an event):
{
    "job": "Illuminate\\Events\\CallQueuedHandler@call",
    "data": {
        "class": "Staw\\Magento\\Listeners\\OrderEventListener",
        "method": "onOrderChangeAlert",
        "data": "a:1:{i:0;O:49:\"Staw\\HiveEvent\\Events\\Order\\OrderChangeAlert\":6:{s:10:\"eventstart\";O:13:\"Carbon\\Carbon\":3:{s:4:\"date\";s:26:\"2017-12-14 11:20:04.000000\";s:13:\"timezone_type\";i:3;s:8:\"timezone\";s:13:\"Europe/London\";}s:8:\"chain_id\";s:13:\"5a325e64510ce\";s:10:\"channel_id\";i:1;s:11:\"channel_key\";N;s:9:\"entity_id\";s:19:\"[Order Number]\";s:5:\"queue\";s:5:\"event\";}}"
    },
    "id": "0GLUt1O9y6jaZ56yHKA8Vn9si1QNnmlf",
    "attempts": 1
}
I'm aware that I can search Redis using the predis/predis package, but this seems really cumbersome. I've also considered building my own queue/stack, perhaps on the DB, to have more control over things, but this seems like a big task, and I feel I'd be missing out on some of the features I get from the stock Laravel setup.
Upvotes: 3
Views: 2662
Reputation: 5129
One workaround, though it may require changing the application structure, is to log job information (the job ID, job name, job data, etc.) to a table before pushing the job onto the queue. When a queue worker handles a job, the handler should look up the table to see whether a duplicate job exists; if so, terminate the job immediately. The same applies to events.
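A minimal sketch of that push-time check (all names here are illustrative, and the in-memory array stands in for the proposed log table; in production it would be a DB table with a unique index on the hash, or a Redis key set with NX):

```php
<?php
// Sketch of a dedup log: record a fingerprint of each job before pushing
// it onto the queue, and refuse to record the same fingerprint twice.

class JobLog
{
    private $seen = [];   // stand-in for the log table / Redis keys

    // Returns true if this (job name, data) tuple is new, false if an
    // identical job has already been logged (i.e. a duplicate).
    public function recordIfNew($jobName, array $data)
    {
        // Deterministic fingerprint of the job and its payload.
        $hash = sha1($jobName . '|' . json_encode($data));

        if (isset($this->seen[$hash])) {
            return false;  // duplicate: caller should skip the push
        }

        $this->seen[$hash] = time();
        return true;       // new job: safe to push onto the queue
    }
}

$log = new JobLog();

var_dump($log->recordIfNew('AddTrackingNumber', ['order' => 123])); // bool(true)
var_dump($log->recordIfNew('AddTrackingNumber', ['order' => 123])); // bool(false)
var_dump($log->recordIfNew('AddTrackingNumber', ['order' => 456])); // bool(true)
```

Hashing the job name together with its serialized payload gives a stable fingerprint, so exact duplicates collide while distinct jobs do not. You would also want to expire old entries so legitimate repeat jobs (e.g. two real stock updates for the same product) are not suppressed forever.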
There are several ways to implement this:
For Laravel 4, modify the fire method of all job handlers, or have them inherit from an abstract class, so that each handler logs the job and looks up whether the tuple ($job, $data) already exists. If it already exists, terminate the job immediately.
For Laravel 5, there is a handier way: use queue job events (Queue::before) to log the job, then modify each job handler to look up whether a duplicate job already exists. If it already exists, terminate the job immediately.
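The Laravel 4 variant could be sketched like this. Everything below is illustrative: FakeJob is a stand-in for Laravel's Illuminate\Queue\Jobs\Job so the example runs on its own, and the static array stands in for the log table:

```php
<?php
// Abstract base handler: dedups in fire() before delegating to the
// subclass, which only implements the real work in handle().

abstract class DedupJobHandler
{
    private static $processed = [];   // stand-in for the log table

    // Laravel 4 queue workers call fire($job, $data) on the handler.
    public function fire($job, $data)
    {
        $hash = sha1(get_class($this) . '|' . json_encode($data));

        if (isset(self::$processed[$hash])) {
            $job->delete();           // duplicate: drop it immediately
            return;
        }
        self::$processed[$hash] = true;

        $this->handle($job, $data);   // real work happens in subclass
        $job->delete();
    }

    abstract protected function handle($job, $data);
}

// Minimal stand-in for Laravel's job object so the sketch runs alone.
class FakeJob
{
    public $deleted = false;
    public function delete() { $this->deleted = true; }
}

class AddTrackingNumber extends DedupJobHandler
{
    public $runs = 0;
    protected function handle($job, $data) { $this->runs++; }
}

$handler = new AddTrackingNumber();
$handler->fire(new FakeJob(), ['order' => 123]);
$handler->fire(new FakeJob(), ['order' => 123]); // duplicate, skipped

echo $handler->runs; // handled once despite two deliveries
```

The same pattern applies to queued event listeners: put the lookup in a shared base class so individual handlers don't have to repeat it.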
Upvotes: 2