Reputation: 12752
I have a process that creates a new batch of jobs at a fixed interval (every minute) and I want to send them to kue for processing by another process.
Sometimes, the same job can be in different batches.
What happens if a job that was sent in a previous batch wasn't completed by the time it is sent again in a new batch?
My understanding is that it will be treated as a new job and executed twice.
Is this correct, and is there a way to avoid it?
Upvotes: 2
Views: 742
Reputation: 1545
One thing you could do is trap the job complete
event and traverse the list of queued jobs (as explained in this excellent post) to remove a possible duplicate, assuming you can identify it.
I have never done this myself and, if you follow this route, be wary of race conditions: I wonder whether a duplicate job could be scheduled before you finish traversing the pending jobs (I do not know).
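As an alternative to scanning the queue after the fact, the producer process itself could skip jobs that are still pending from an earlier batch. The sketch below is illustrative only and does not use kue's API: `DedupingProducer`, the `enqueue` callback, and the `id` field are all hypothetical names, and the in-memory `Set` stands in for whatever store you would use to track in-flight jobs (in a real kue setup you would clear an entry from the job complete event).

```javascript
// Illustrative sketch: a producer-side guard that refuses to re-enqueue a
// job whose identifier is still pending from an earlier batch.
// "id" is a hypothetical field: any stable key that identifies a job works.
class DedupingProducer {
  constructor(enqueue) {
    this.enqueue = enqueue;   // callback that actually submits the job
    this.pending = new Set(); // identifiers of jobs not yet completed
  }

  // Called once per batch; returns the jobs that were actually enqueued.
  submitBatch(jobs) {
    const accepted = [];
    for (const job of jobs) {
      if (this.pending.has(job.id)) continue; // still in flight: skip duplicate
      this.pending.add(job.id);
      this.enqueue(job);
      accepted.push(job);
    }
    return accepted;
  }

  // Call this from the worker's "complete" event handler so the job
  // becomes eligible for enqueueing again in a later batch.
  markComplete(jobId) {
    this.pending.delete(jobId);
  }
}
```

This avoids the race in the scan-and-remove approach, at the cost of keeping the pending set accurate: if a worker crashes without reporting completion, the job stays blocked until you expire or reset its entry.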
Hope this helps.
Upvotes: 1