Praful Bagai

Reputation: 17382

Python - Locking a shared resource (Multiprocessing)

I want to send a trigger to start the second process. The idea is that only after a certain amount of time has passed can the second process start, similar to locking a shared resource. How would I do this?

Here's my sample code :-

import multiprocessing

def worker():
    for i in range(1,10):
        if i == 5:
            # send a trigger to start the next event.
            # It's like 'locking' a shared resource.
            pass

def main():
    for i in range(1,100):
        d = multiprocessing.Process(target = worker, args = ())
        d.daemon = True
        d.start()

Edited (Expected Output)

Process1 loop1
Process1 loop2
Process1 loop3
Process1 loop4
1 2014-08-27 11:45:51.687848 # after this random numbers should get print.
Process2 loop1
Process2 loop2
Process2 loop3
Process2 loop4
Process1 loop5
Process1 loop6
Process1 loop7
Process1 loop8
Process1 loop9
2 2014-08-27 11:45:51.690052
Process2 loop5
Process2 loop6
Process2 loop7
Process2 loop8
Process2 loop9

Current OUTPUT

Process1 loop1
Process1 loop2
Process1 loop3
Process1 loop4
1 2014-08-27 11:45:51.687848
Process1 loop5
Process1 loop6
Process1 loop7
Process1 loop8
Process1 loop9
Process2 loop1
Process2 loop2
Process2 loop3
Process2 loop4
2 2014-08-27 11:45:51.690052
Process2 loop5
Process2 loop6
Process2 loop7
Process2 loop8
Process2 loop9

Upvotes: 0

Views: 1078

Answers (1)

Anthony Kong

Reputation: 40634

I usually prefer to control processes with a Queue. It gives you more flexibility, because a process can perform different actions based on the commands it receives from the queue.

import datetime
import multiprocessing
from multiprocessing import Queue

def worker(work_queue):
    # Block until the controller sends the "Start" command.
    if work_queue.get() == "Start":
        for i in range(1,10):
            if i == 5:
                # do something
                print datetime.datetime.now()

def main():
    worker_queues = {}
    for i in xrange(1, 6):
        q = Queue()
        worker_queues[i] = q # one queue per process here
        d = multiprocessing.Process(target = worker, args = (q,))
        d.daemon = True
        d.start()
    for wq in worker_queues.values():
        wq.put("Start")

if __name__ == "__main__":
    main()
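
As an illustration of the "different actions based on commands" point, a worker can keep reading from its queue and branch on the command it receives. The "Report" and "Stop" commands below are made up for this sketch; only "Start" is used in the code above.

import datetime
import multiprocessing
from multiprocessing import Queue

def worker(work_queue):
    # Keep handling commands until told to stop.
    # The command names are arbitrary strings chosen for this example.
    while True:
        command = work_queue.get()  # blocks until a command arrives
        if command == "Stop":
            break
        elif command == "Report":
            print datetime.datetime.now()

def main():
    q = Queue()
    d = multiprocessing.Process(target=worker, args=(q,))
    d.daemon = True
    d.start()
    q.put("Report")
    q.put("Stop")
    d.join()

if __name__ == "__main__":
    main()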

EDIT: In response to your edited question, you can enforce execution order by adding a "done" queue to the solution above:

import datetime
import multiprocessing
from multiprocessing import Queue

def worker(myid, work_queue, done_queue):
    # Block until the controller sends the "Start" command.
    if work_queue.get() == "Start":
        for i in range(1,10):
            if i == 5:
                # do something
                print myid, datetime.datetime.now()
        done_queue.put(myid)

def main():
    worker_queues = {}
    for i in xrange(1, 6): 
        q = Queue()
        done_q = Queue()
        worker_queues[i] = (q, done_q)
        d = multiprocessing.Process(target = worker, args = (i, q, done_q))
        d.daemon = True
        d.start()


    for i in xrange(1, 6):
        worker_queues[i][0].put("Start")
        # get() blocks until worker i reports that it is done,
        # so the next worker is not released until this one has finished.
        worker_queues[i][1].get()

if __name__ == "__main__":
    main() 

You can think of this solution as a kind of message-passing mechanism between processes.
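
Since the original question talks about sending a one-off "trigger", the same handshake can also be expressed with multiprocessing.Event instead of queues. The sketch below is only an alternative illustration of that idea, not the solution above; the variable names are chosen for this example.

import datetime
import multiprocessing

def worker(myid, start_event, done_event):
    start_event.wait()  # block until the controller sets the trigger
    for i in range(1, 10):
        if i == 5:
            print myid, datetime.datetime.now()
    done_event.set()    # signal that this worker has finished

def main():
    events = {}
    for i in xrange(1, 6):
        start_ev = multiprocessing.Event()
        done_ev = multiprocessing.Event()
        events[i] = (start_ev, done_ev)
        d = multiprocessing.Process(target=worker, args=(i, start_ev, done_ev))
        d.daemon = True
        d.start()

    for i in xrange(1, 6):
        events[i][0].set()   # trigger worker i
        events[i][1].wait()  # wait until it reports completion

if __name__ == "__main__":
    main()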

Upvotes: 1
