rajpreet singh

Reputation: 41

Run functions in parallel in Python 2.7 to use the output of one function at the end of other functions

I am a newbie to Python and have never used its parallel processing modules like threading or multiprocessing. I am working on a real-time problem where the output of one function is used as the input of another. There is one big function which takes almost 3 seconds to complete. It is like a process where a person submits some documents at reception, and while his documents are being verified he is directed elsewhere for different checks. If the result of the document verification is not available by the end of these checks, the program will fail.

def parallel_running_function(*args):
    """It is the function which will take 3 seconds to complete"""
    output = "various documents matching and verification"
    return output

def check_1(*args):
    """ check one for the task"""

def check_2(*args):
    """ check two for the task"""

def check_3(*args):
    """ check three for the task"""

def check_4(*args):
    """ check 4 for the task"""


def main_function():

    output = parallel_running_function()  # need to run this function in
                                          # parallel with the other functions
    output_1 = check_1()
    output_2 = check_2()
    output_3 = check_3()
    output_4 = check_4()
    if output:
        print "program is successful"
    else:
        print "program is failed"

I need the output of parallel_running_function here, along with the outputs of the other executed functions. If I don't get the output of that function at this point, the program will fail or give a wrong result.

I am using Python 2.7. I have read multiple posts about this problem using the threading, subprocess, and multiprocessing modules of Python, but I couldn't find a concrete solution. From what I gathered in other posts, it seems I need to use the multiprocessing module. Can someone please give me an idea of how I should solve this problem?

Upvotes: 1

Views: 949

Answers (1)

elveatles

Reputation: 2730

You can do something like this:

import multiprocessing

pool = None

def parallel_running_function(*args):
    """It is the function which will take 3 seconds to complete"""
    output = "various documents matching and verification"
    return output

def check_1(*args):
    """ check one for the task"""

def check_2(*args):
    """ check two for the task"""

def check_3(*args):
    """ check three for the task"""

def check_4(*args):
    """ check 4 for the task"""


def main_function():

    res = pool.apply_async(parallel_running_function)

    res_1 = pool.apply_async(check_1)
    res_2 = pool.apply_async(check_2)
    res_3 = pool.apply_async(check_3)
    res_4 = pool.apply_async(check_4)

    output = res.get()
    output_1 = res_1.get()
    output_2 = res_2.get()
    output_3 = res_3.get()
    output_4 = res_4.get()

    if output:
        print "program is successful"
    else:
        print "program is failed"


if __name__ == '__main__':
    pool = multiprocessing.Pool(processes=4)
    main_function()

The main process will block when `get` is called, but the worker processes will still be running in the background, so all five tasks execute concurrently.
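To see this overlap concretely, here is a minimal runnable sketch of the same `apply_async`/`get` pattern. It uses `multiprocessing.dummy`, which provides a thread-based `Pool` with the identical API (handy when the worker functions can't be pickled); `slow_verification` and `quick_check` are hypothetical stand-ins for the verification function and the four checks:

```python
import time
from multiprocessing.dummy import Pool  # thread-based Pool, same API as multiprocessing.Pool

def slow_verification():
    """Stand-in for the ~3 second document verification (shortened to 1s here)."""
    time.sleep(1.0)
    return "documents verified"

def quick_check(n):
    """Stand-in for one of the quick checks."""
    time.sleep(0.2)
    return "check %d done" % n

pool = Pool(processes=5)
start = time.time()

# Submit everything first; all five tasks start running concurrently.
slow_res = pool.apply_async(slow_verification)
check_res = [pool.apply_async(quick_check, (i,)) for i in range(1, 5)]

# get() blocks until each result is ready. Because the tasks overlap,
# the total wall time is close to the slowest task (~1s), not the
# sum of all task times (~1.8s).
checks = [r.get() for r in check_res]
output = slow_res.get()
elapsed = time.time() - start

print(output)
print(checks)
print(elapsed < 1.8)  # True: tasks ran concurrently

pool.close()
pool.join()
```

The same code works under Python 2.7 as well; the only difference from the answer above is the thread-based pool, which trades true CPU parallelism for simpler argument passing.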

Upvotes: 1
