Reputation: 465
Let's say I have a queue with a bunch of messages in it. I have 2 consumers connected to that queue, both set with a prefetch count of 1. The work these consumers do takes some time, and I don't want to acknowledge a message until that work is done (in case the consumer crashes or something - I want the message to automatically re-enter the queue in exceptional cases).
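Each consumer is set up roughly like this (a simplified sketch using pika 1.x purely to illustrate the pattern; the queue name and the sleep standing in for the real work are placeholders):

```python
# Simplified sketch of one consumer (pika used for illustration only;
# queue name and the sleep standing in for the real work are placeholders).
import time
import pika

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="work_queue", durable=True)

# Don't send this consumer a new message until the previous one is acknowledged.
channel.basic_qos(prefetch_count=1)

def on_message(ch, method, properties, body):
    time.sleep(5)  # placeholder for the long-running work
    # Acknowledge only after the work has finished, so a crash mid-work
    # puts the message back on the queue.
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="work_queue", on_message_callback=on_message)
channel.start_consuming()
```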
But I also want these consumers to work in parallel, and that doesn't appear to be happening. In other words, as long as there are 2+ messages in the queue, I'd expect both consumers to be busy.
What appears to be happening instead is that consumer 1 receives a message, but consumer 2 waits until consumer 1 has acknowledged its message. Then consumer 2 receives a message and consumer 1 waits, and so on.
Is there an option I'm missing? Or should this be working, and I just have a bug in my code somewhere? Or is this not possible?
Upvotes: 1
Views: 780
Reputation: 94951
You should be able to pull messages off the queue while previous messages are still being processed by other consumers. The RabbitMQ tutorial specifically points to parallelism as a strength of round-robin dispatching (http://www.rabbitmq.com/tutorials/tutorial-two-python.html). Are your two consumers running as threads in the same process? I wonder if you've just made a mistake in the implementation.
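One quick way to rule out a problem in how the two consumers share a process is to run each one as a separate process with its own connection, along these lines (pika assumed, since the linked tutorial is Python; the queue name and sleep duration are placeholders):

```python
# Two independent workers, each with its own connection, channel, prefetch of 1,
# and manual acks. With 2+ messages queued, both should print "started" before
# either prints "done".
import time
from multiprocessing import Process

import pika

def worker(name):
    connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
    channel = connection.channel()
    channel.queue_declare(queue="work_queue", durable=True)
    channel.basic_qos(prefetch_count=1)  # one unacknowledged message per consumer

    def on_message(ch, method, properties, body):
        print(f"{name} started {body!r}")
        time.sleep(5)  # stand-in for the slow work
        ch.basic_ack(delivery_tag=method.delivery_tag)
        print(f"{name} done {body!r}")

    channel.basic_consume(queue="work_queue", on_message_callback=on_message)
    channel.start_consuming()

if __name__ == "__main__":
    for name in ("consumer-1", "consumer-2"):
        Process(target=worker, args=(name,)).start()
```

If the two workers overlap in a setup like this but your consumers don't, the difference is most likely in how your consumers share a connection or channel in your implementation.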
Upvotes: 1