Reputation: 268
Can both consuming and publishing be done in one Python thread using RabbitMQ channels?
Upvotes: 3
Views: 3477
Reputation: 7705
Kombu is a common Python library for working with RabbitMQ (Celery uses it under the hood). It is worth pointing out that, for the simplest use of Kombu that I tried, the answer to your question is "No - you can't receive and publish on the same consumer callback thread."
Specifically, if there are several messages in the queue for a consumer that has registered a callback for that topic, and that callback does some processing and publishes the results, then publishing the result will cause the 2nd message in the queue to hit the callback before the callback has returned from the publish for the 1st message - so you end up with a recursive call to the callback. If you have n messages on the queue, your call stack will end up n messages deep before it unwinds. Obviously that explodes pretty quickly.
One solution (not necessarily the best) is to have the callback just post the message onto a simple queue internal to the consumer, which can then be processed on the main process thread (i.e. off the callback thread):
def process_message(self, body: str, message: Message):
    # Queue the message for processing off this thread:
    print("Start process_message ----------------")
    if self.publish_on_callback:
        self.do_process_message(body, message)
    else:
        self.queue.put((body, message))
    print("End process_message ------------------")

def do_process_message(self, body: str, message: Message):
    # Deserialize and "process" the message:
    print(f"Process message: {body}")
    # ... msg processing code...
    # Publish a processing output:
    processing_output = self.get_processing_output()
    print(f"Publishing processing output: {processing_output}")
    self.rabbit_msg_transport.publish(Topics.ProcessingOutputs, processing_output)
    # Acknowledge the message:
    message.ack()

def run_message_loop(self):
    while True:
        print("Waiting for incoming message")
        self.rabbit_connection.drain_events()
        # Process anything the callback queued, off the callback thread:
        while not self.queue.empty():
            body, message = self.queue.get(block=False)
            self.do_process_message(body, message)
In the snippet above, process_message is the callback. If publish_on_callback is True, you'll see recursion in the callback n deep for n messages on the RabbitMQ queue. If publish_on_callback is False, it runs correctly without recursion in the callback.
Another approach is to use a second Connection for the producer exchange, separate from the Connection used by the Consumer. This also works, so that the callback consuming a message and publishing the result completes before the callback fires again for the next message on the queue.
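A minimal sketch of that two-connection approach with Kombu might look like the following (the broker URL, exchange, queue names, and routing keys are hypothetical):

from kombu import Connection, Exchange, Queue, Producer, Consumer

exchange = Exchange("processing", type="topic")
in_queue = Queue("inputs", exchange, routing_key="inputs")

# One connection for consuming, a separate one for publishing:
consume_conn = Connection("amqp://guest:guest@localhost//")
publish_conn = Connection("amqp://guest:guest@localhost//")
producer = Producer(publish_conn.channel(), exchange=exchange)

def on_message(body, message):
    # Publishing goes over the second connection, so draining events
    # on consume_conn does not re-enter this callback mid-publish.
    producer.publish(body, routing_key="outputs")
    message.ack()

with Consumer(consume_conn, queues=[in_queue], callbacks=[on_message]):
    while True:
        consume_conn.drain_events()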
Upvotes: 0
Reputation: 10666
Actually this isn't a problem at all, and you can do it quite easily with, for example, pika.
The catch, however, is that you'd have to stop consuming, since it's a blocking loop, or else do the producing during the consumption of a message.
Consuming and producing together is a normal use case, especially in pika, since it isn't thread-safe: for example, when you want to implement some form of filter on the messages, or perhaps a smart router, which in turn passes the messages on to another queue.
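As an illustration, a minimal pika sketch that produces from within the consumer callback, on the same channel, might look like this (queue names and broker address are hypothetical):

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
channel = connection.channel()
channel.queue_declare(queue="inputs")
channel.queue_declare(queue="outputs")

def callback(ch, method, properties, body):
    # Produce during the consume of a message: filter/route, then republish.
    ch.basic_publish(exchange="", routing_key="outputs", body=body)
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="inputs", on_message_callback=callback)
channel.start_consuming()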
Upvotes: 1
Reputation: 7624
I think the simple answer to your question is yes, but it depends on what you want to do. My guess is that you have a loop consuming from your thread on one channel, and after some (small or large) processing it sends the message on to another queue (or exchange) on a different channel. I do not see any problem with that at all. Though it might be preferable to dispatch it to a different thread, it is not necessary.
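A rough sketch of that shape, assuming pika and made-up queue names, consuming on one channel and publishing on a second channel of the same connection:

import pika

connection = pika.BlockingConnection(pika.ConnectionParameters("localhost"))
consume_channel = connection.channel()
publish_channel = connection.channel()
consume_channel.queue_declare(queue="inputs")
publish_channel.queue_declare(queue="outputs")

def callback(ch, method, properties, body):
    # Forward the (possibly processed) message via the second channel.
    publish_channel.basic_publish(exchange="", routing_key="outputs", body=body)
    ch.basic_ack(delivery_tag=method.delivery_tag)

consume_channel.basic_consume(queue="inputs", on_message_callback=callback)
consume_channel.start_consuming()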
If you give more details about your process then it might help give a more specific answer.
Upvotes: 0
Reputation: 1002
I'd recommend taking a look at Celery (http://celery.readthedocs.org/en/latest/) to manage worker tasks. With that, you won't need to integrate with RMQ directly, as it will handle the producing and consuming for you.
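As a minimal illustration (the broker URL and task bodies are hypothetical), a Celery worker can "consume" one task and "produce" another without touching RabbitMQ directly:

from celery import Celery

app = Celery("tasks", broker="amqp://guest:guest@localhost//")

@app.task
def process(payload):
    result = payload.upper()      # stand-in for real processing
    publish_result.delay(result)  # Celery publishes this to the broker for you
    return result

@app.task
def publish_result(result):
    print(f"Result: {result}")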
But, if you do desire to integrate with RMQ directly and manage your own workers, check out Kombu (http://kombu.readthedocs.org/en/latest/) for the integration. There are non-blocking consumers and producers that would permit you to have both in the same event loop.
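For example, Kombu's ConsumerProducerMixin is built for this pattern of consuming and publishing on the same connection in one loop (the exchange and queue names below are hypothetical):

from kombu import Connection, Exchange, Queue
from kombu.mixins import ConsumerProducerMixin

exchange = Exchange("processing", type="topic")
in_queue = Queue("inputs", exchange, routing_key="inputs")

class Worker(ConsumerProducerMixin):
    def __init__(self, connection):
        self.connection = connection

    def get_consumers(self, Consumer, channel):
        return [Consumer(queues=[in_queue], callbacks=[self.on_message])]

    def on_message(self, body, message):
        # self.producer publishes on a dedicated channel of the same
        # connection, so this is safe inside the consumer callback.
        self.producer.publish(body, exchange=exchange, routing_key="outputs")
        message.ack()

Worker(Connection("amqp://guest:guest@localhost//")).run()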
Upvotes: 0
Reputation: 308763
I don't think you should want to. MQ means asynchronous processing; doing both consuming and producing in the same thread defeats the purpose, in my opinion.
Upvotes: 0