shreyas

Reputation: 2630

Execute blocking calls in parallel in Python

I need to make a blocking XML-RPC call from my Python script to several physical servers simultaneously and perform actions based on the response from each server independently. To explain in detail, let us assume the following pseudocode:

while True:
    response=call_to_server1() #blocking and takes very long time
    if response==this:
        do that

I want to do this for all the servers simultaneously and independently, but from the same script.
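
For reference, the pattern can be sketched with the standard library's concurrent.futures thread pool; call_to_server below is a placeholder standing in for the real blocking XML-RPC call:

```python
import concurrent.futures

def call_to_server(name):
    # placeholder for the blocking XML-RPC call (assumed helper, not real code)
    return 'response from ' + name

servers = ['server1', 'server2', 'server3']

# run each blocking call in its own thread and handle each response independently
with concurrent.futures.ThreadPoolExecutor(max_workers=len(servers)) as pool:
    futures = {pool.submit(call_to_server, s): s for s in servers}
    for future in concurrent.futures.as_completed(futures):
        response = future.result()
        print(response)
```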

Upvotes: 0

Views: 162

Answers (5)

warvariuc

Reputation: 59674

Use Twisted.

It has a lot of useful tools for working with networks, and it is very good at working asynchronously.

Upvotes: 0

Cinquo

Reputation: 663

You can use multiprocessing plus queues. Here is an example with a single sub-process:

import multiprocessing

def processWorker(input, result):
    def remoteRequest(params):
        ## this is my remote request
        return True
    while True:
        work = input.get()
        if 'STOP' in work:
            break
        result.put(remoteRequest(work))

input  = multiprocessing.Queue()
result = multiprocessing.Queue()

p = multiprocessing.Process(target=processWorker, args=(input, result))
p.start()
requestlist = ['1', '2']
for req in requestlist:
    input.put(req)
for i in range(len(requestlist)):
    res = result.get(block=True)
    print('retrieved', res)

input.put('STOP')
p.join()  # wait for the worker to exit instead of sleeping
print('done')

To have more than one sub-process, simply use a list to store all the sub-processes you start. The multiprocessing queue is safe to share between them.

Then you may keep track of which request is being executed by each sub-process by storing the request associated with a work id (the work id can be a counter incremented as the queue is filled with new work). Using multiprocessing.Queue is robust since you do not need to rely on parsing stdout/stderr, and you also avoid the related limitations.
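
A sketch of that work-id bookkeeping (the names are hypothetical, not from the answer); a thread stands in for the sub-process here so the queue protocol is easy to see, but multiprocessing.Process works the same way:

```python
import multiprocessing
import threading

def process_worker(tasks, results):
    # each work item carries (workid, payload); echo the id back with the result
    while True:
        work = tasks.get()
        if work == 'STOP':
            break
        workid, payload = work
        results.put((workid, 'handled ' + payload))

tasks = multiprocessing.Queue()
results = multiprocessing.Queue()

# a thread stands in for the sub-process in this sketch
worker = threading.Thread(target=process_worker, args=(tasks, results))
worker.start()

# the workid is a counter incremented as the queue is filled with new work
for workid, request in enumerate(['first', 'second']):
    tasks.put((workid, request))

for _ in range(2):
    workid, result = results.get(block=True)
    print(workid, result)

tasks.put('STOP')
worker.join()
```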

Then, you can also set a timeout on how long you want a get call to wait at most, e.g.:

import queue
try:
    res = result.get(block=True, timeout=10)
except queue.Empty:
    print('no result within the timeout')

Upvotes: 0

renenglish

Reputation: 728

You can use the multiprocessing module:

import multiprocessing

def call_to_server(ip, port):
    ...

processes = []
for i in range(server_count):
    p = multiprocessing.Process(target=call_to_server, args=(ip, port))
    processes.append(p)
    p.start()

# wait for the processes to finish
for p in processes:
    p.join()

Upvotes: 0

Nix

Reputation: 58562

Boilerplate threading code (I can tailor this if you give me a little more detail on what you are trying to accomplish)

import threading

# used to stop your loops
stop_event = threading.Event()

def run_me(func):
    while not stop_event.is_set():
        response = func()  # blocking and takes a very long time
        if response == this:  # pseudocode from the question
            do_that()

def call_to_server1():
    # code to call server 1...
    return magic_server1_call()

def call_to_server2():
    # code to call server 2...
    return magic_server2_call()

# note the trailing comma: args must be a tuple
t = threading.Thread(target=run_me, args=(call_to_server1,))
t.start()

t2 = threading.Thread(target=run_me, args=(call_to_server2,))
t2.start()

# wait for the threads to return
t.join()
t2.join()

# we are done...

Upvotes: 1

Marcelo Cantos

Reputation: 185970

Use the threading module.
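
A minimal sketch of that approach (poll_server and the server names are placeholders, not from the answer):

```python
import threading

def poll_server(name, results):
    # placeholder for the blocking XML-RPC call made by each thread
    results.append('response from ' + name)

results = []
threads = [threading.Thread(target=poll_server, args=(name, results))
           for name in ('server1', 'server2')]
for t in threads:
    t.start()
for t in threads:  # wait for every server's call to finish
    t.join()
print(sorted(results))
```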

Upvotes: 1
