R3uben

Reputation: 125

Python multiprocessing without blocking parent process

I am attempting to create a simple application which continuously monitors an inbox, then calls various functions as child processes, after categorising incoming mail.

I would like the parent process to continue its while loop without waiting for the child process to complete. For example:

import time

def main():
    while 1:
        mail = checkForMail()
        if mail:
            if mail['type'] == 'Type1':
                process1()
                '''
                spawn process1, as long as no other process1 process is running;
                however it's fine for a process2 to be currently running
                '''
            elif mail['type'] == 'Type2':
                process2()
                '''
                spawn process2, as long as no other process2 process is running;
                however it's fine for a process1 to be currently running
                '''

        # Wait a bit, then continue the loop regardless of whether child processes have finished
        time.sleep(10)

if __name__ == '__main__':
    main()

As commented above, there should never be more than one concurrent child process instance of a given function, but processes running different functions may run concurrently.

Is this possible to do with the multiprocessing package?

Upvotes: 0

Views: 1034

Answers (2)

R3uben

Reputation: 125

Following on from pdeubel's answer, which was very helpful, the completed skeleton script is below. Each Queue has a single dedicated consumer Process that handles jobs one at a time, so at most one instance of each function runs at any moment, while func1 and func2 can still run in parallel:

import time
from multiprocessing import Process, Queue

def func1(todo):
    # do stuff with current todo item from queue1
    pass

def func2(todo):
    # do stuff with current todo item from queue2
    pass

def listenQ1(q):
    while 1:
        # Fetch jobs from queue1 (q.get() blocks until an item is available)
        todo = q.get()
        func1(todo)

def listenQ2(q):
    while 1:
        # Fetch jobs from queue2
        todo = q.get()
        func2(todo)

def main(queue1, queue2):
    while 1:
        mail = checkForMail()
        if mail:
            if mail['type'] == 'Type1':
                # Add job to queue1
                queue1.put('do q1 stuff')

            elif mail['type'] == 'Type2':
                # Add job to queue2
                queue2.put('do q2 stuff')

        # Wait a bit, then check for mail again
        time.sleep(10)

if __name__ == '__main__':
    # Create 2 multiprocessing queues
    queue1 = Queue()
    queue2 = Queue()

    # Create and start two new processes, with separate targets and queues
    p1 = Process(target=listenQ1, args=(queue1,))
    p1.start()
    p2 = Process(target=listenQ2, args=(queue2,))
    p2.start()

    # Start main while loop and check for mail
    main(queue1, queue2)

    p1.join()
    p2.join()

Upvotes: 2

pdeubel

Reputation: 26

You could use two Queues, one for mails of Type1 and one for mails of Type2, and two Processes, again one for each mail type.

Start by creating these Queues. Then create the Processes and give the first Queue to the first Process and the second Queue to the second Process. Both Process objects need a target parameter, which is the function the Process executes. Depending on your logic you will probably need two functions (again, one for each type). Inside each function you want something like an infinite loop that takes items from the Queue (i.e. the mails) and then acts on them according to your logic. The main function would also consist of an infinite loop where the mails are retrieved and, depending on their type, placed on the correct Queue.

So start the two Processes before the main loop, then run the main loop; the mails get put on the Queues, where they are picked up by the subprocesses.
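For reference, a minimal, runnable sketch of that structure might look like this (the handler functions, the fake mail list, and the None sentinel used to stop the demo are illustrative additions, not part of the answer above):

from multiprocessing import Process, Queue

def handle_type1(todo):
    # Stand-in for the real Type1 handler
    print('Type1 worker handling:', todo)

def handle_type2(todo):
    # Stand-in for the real Type2 handler
    print('Type2 worker handling:', todo)

def worker(q, handler):
    # Consumer loop: q.get() blocks until a job arrives.
    # A None sentinel is used here only so this demo can shut down cleanly;
    # the skeleton above simply loops forever.
    while True:
        todo = q.get()
        if todo is None:
            break
        handler(todo)

if __name__ == '__main__':
    queue1 = Queue()
    queue2 = Queue()

    p1 = Process(target=worker, args=(queue1, handle_type1))
    p2 = Process(target=worker, args=(queue2, handle_type2))
    p1.start()
    p2.start()

    # Main-loop stand-in: route fake "mails" to the right queue without
    # waiting for the workers to finish each job.
    fake_mails = [{'type': 'Type1', 'body': 'a'},
                  {'type': 'Type2', 'body': 'b'},
                  {'type': 'Type1', 'body': 'c'}]
    for mail in fake_mails:
        if mail['type'] == 'Type1':
            queue1.put(mail)
        elif mail['type'] == 'Type2':
            queue2.put(mail)

    # Tell both workers to stop, then wait for them.
    queue1.put(None)
    queue2.put(None)
    p1.join()
    p2.join()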

Upvotes: 1
