Reputation: 71
I'm trying to simulate a producer-consumer design in Python 3 multiprocessing. The main problem is that the producer starts but the consumer doesn't start until the producer finishes (in this scenario, the consumer doesn't start because the producer never ends).
Here is the code:
#!/usr/bin/python3
from scapy.all import *
from queue import Queue
from multiprocessing import Process

queue = Queue()

class Producer(Process):
    def run(self):
        global queue
        print("Starting producer thread")
        sniff(iface="wlan1mon", store=0, prn=pkt_callback)

def pkt_callback(pkt):
    queue.put(pkt)
    print(queue.qsize())

class Consumer(Process):
    def run(self):
        global queue
        while True:
            pkt = queue.get()
            queue.task_done()
            if pkt.haslayer(Dot11):
                print("**Packet with Dot11 layer has been processed")
            else:
                print("--Packet without Dot11 layer has been processed")

if __name__ == '__main__':
    Producer().start()
    Consumer().start()
I don't know what's wrong in my code. I tested it with multithreading and it works, so I guess there's something I've misunderstood about multiprocessing.
Thank you.
Upvotes: 1
Views: 91
Reputation: 1929
I don't think your queue is a shared memory object. Each process gets its own copy: your producer is writing to a queue in its own memory and your consumer is reading from a separate queue in its memory, so they never talk to each other. One fix is to wrap the queue in a "manager". See the docs: https://docs.python.org/2/library/multiprocessing.html#sharing-state-between-processes
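To illustrate, here's a minimal sketch of the manager approach. The `producer`/`consumer` functions, the numeric items, and the `None` sentinel are stand-ins for the scapy sniffing logic, not part of the original code:

```python
from multiprocessing import Manager, Process

def producer(q):
    # Stand-in for the sniff() callback: put a few items on the queue.
    for i in range(3):
        q.put(i)
    q.put(None)  # sentinel so the consumer knows to stop

def consumer(q):
    while True:
        item = q.get()
        if item is None:
            break
        print("consumed", item)

if __name__ == "__main__":
    with Manager() as mgr:
        q = mgr.Queue()  # proxy queue, safe to share between processes
        p = Process(target=producer, args=(q,))
        c = Process(target=consumer, args=(q,))
        p.start()
        c.start()
        p.join()
        c.join()
```

The manager runs the queue in a server process and hands each worker a proxy, so both sides really do see the same queue.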
Or use the multiprocessing version of a Queue. Again from the docs: https://docs.python.org/2/library/multiprocessing.html#exchanging-objects-between-processes
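A sketch of that second approach, keeping your Process-subclass structure but passing a multiprocessing.Queue in explicitly instead of relying on a global (the numeric items and the `None` sentinel stand in for packets; note that multiprocessing.Queue has no task_done()):

```python
from multiprocessing import Process, Queue

class Producer(Process):
    def __init__(self, queue):
        super().__init__()
        self.queue = queue  # pass the queue in; globals are not shared across processes

    def run(self):
        for i in range(3):  # stand-in for packets delivered by sniff()
            self.queue.put(i)
        self.queue.put(None)  # sentinel so the consumer can exit

class Consumer(Process):
    def __init__(self, queue):
        super().__init__()
        self.queue = queue

    def run(self):
        while True:
            item = self.queue.get()
            if item is None:
                break
            print("processed", item)

if __name__ == "__main__":
    q = Queue()  # multiprocessing.Queue, not queue.Queue
    prod, cons = Producer(q), Consumer(q)
    prod.start()
    cons.start()
    prod.join()
    cons.join()
```

With this change the consumer starts receiving items as soon as the producer puts them, rather than waiting for the producer to finish.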
Upvotes: 1