Aman Gupta

Reputation: 3797

Python: multiprocessing and requests

Below is a snippet of the code I am running, which uses multiprocessing to fire HTTP requests in parallel. When run from the console it hangs at requests.get(url), neither proceeding nor throwing an error.

import multiprocessing
import requests

def echo_100(q):
    ...
    print "before"
    r = requests.get(url)
    print "after"
    ...
    q.put(r)

q = multiprocessing.Queue()
p = multiprocessing.Process(target=echo_100, args=(q,))  # args must be a tuple
p.start()
p.join()
resp = q.get()

Upvotes: 7

Views: 3230

Answers (3)

Kevin Ferguson

Reputation: 81

On Mac OS, there seem to be some bugs reading proxy settings from the operating system. I don't know the exact details, but it sometimes causes requests to hang when using multiprocessing. You could try to circumvent the problem by disabling OS proxies entirely, like this:

session = requests.Session()
session.trust_env = False  # Don't read proxy settings from OS
r = session.get(url)

That fixed it for me.
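Putting that together with the snippet from the question, a minimal sketch of the workaround inside the worker might look like this (the url value and the choice to queue r.text rather than the whole Response are assumptions, not from the original post):

import multiprocessing
import requests

url = "http://example.com"  # placeholder URL, not from the original post

def echo_100(q):
    # A Session with trust_env disabled never consults the OS proxy
    # settings, which is the suspected source of the hang here.
    session = requests.Session()
    session.trust_env = False
    r = session.get(url)
    q.put(r.text)  # queue the body instead of the whole Response object

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=echo_100, args=(q,))
    p.start()
    body = q.get()  # read before joining so the child is never stuck writing
    p.join()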

Upvotes: 8

kilgoretrout

Reputation: 3657

I had the exact same problem on a project. I found that removing the import ipdb statements in all my modules resolved the issue. I'm not sure why that import was causing the problem, but eliminating those imports fixed it completely. Just having the import alone caused the problem; I wasn't even using anything from the ipdb package.

UPDATE: This happens on both Python 2.7.10 and 3.5.0, and only when I import ipdb; everything is fine if I import pdb instead. I've posted a related question to ask why this is happening.
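A minimal sketch of that workaround, keeping a breakpoint available via the standard-library pdb instead of importing ipdb in the worker module (the fetch function and URL here are illustrative, not from the original post):

import multiprocessing
import pdb  # standard-library debugger; no hang was observed with this import
import requests

def fetch(q, url):
    # pdb.set_trace()  # uncomment to debug inside the child process
    r = requests.get(url)
    q.put(r.status_code)

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=fetch, args=(q, "http://example.com"))
    p.start()
    print(q.get())
    p.join()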

Hope this resolves your issue too.

Upvotes: 0

Nipun Talukdar

Reputation: 5387

If you don't drain the queue, that is, if you never take items out of it, the writing process will hang after some time. On Linux, multiprocessing queues are backed by unnamed FIFOs (pipes), and pipes have a maximum size. So if a process is writing to the pipe and no other process is reading from it, the writer will eventually block while trying to put more data into the pipe (internally it may be hanging on the write system call).

I suspect you are not getting items out of the queue, so the queue becomes full after some time and the child process writing to it stalls.

Now, if the child process hangs, the parent process may also hang when it tries to join the child (p.join()), since internally join calls waitpid on the child process.
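Applied to the snippet in the question, a minimal sketch of that fix is to read from the queue before joining, so the child never blocks on a full pipe while the parent waits in join (the url value is a placeholder):

import multiprocessing
import requests

url = "http://example.com"  # placeholder URL, not from the original post

def echo_100(q):
    r = requests.get(url)
    q.put(r.text)

if __name__ == '__main__':
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=echo_100, args=(q,))
    p.start()
    resp = q.get()  # drain the queue first so the child can finish its put()
    p.join()        # joining afterwards cannot deadlock on a blocked child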

Upvotes: 0
