Pythonsnake99

Reputation: 31

Optimise network-bound multiprocessing code

I have a function I'm calling with multiprocessing.Pool

Like this:

from multiprocessing import Pool

def ingest_item(id):
    # goes and does a lot of network calls
    # adds a bunch to a remote db
    return None

if __name__ == '__main__':
    p = Pool(12)
    thing_ids = range(1000000)
    p.map(ingest_item, thing_ids)

The list pool.map iterates over contains around 1 million items. Each ingest_item() call hits 3rd party services and adds data to a remote PostgreSQL database.

On a 12-core machine this processes ~1,000 pool.map items in 24 hours, and CPU and RAM usage stay low.

How can I make this faster?

Would switching to threads make sense, since the bottleneck seems to be network calls?
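Something like this is what I had in mind (untested; multiprocessing.dummy provides a thread-backed pool with the same API as Pool):

from multiprocessing.dummy import Pool as ThreadPool  # threads, same API as Pool

def ingest_item(id):
    # same network-bound work as above
    return None

if __name__ == '__main__':
    # threads that are waiting on the network don't each need a core,
    # so the pool can be much larger than the core count
    p = ThreadPool(100)
    thing_ids = range(1000000)
    p.map(ingest_item, thing_ids)
    p.close()
    p.join()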

Thanks in advance!

Upvotes: 3

Views: 529

Answers (2)

NitrogenReaction
NitrogenReaction

Reputation: 336

It is unlikely that switching to threads alone will substantially improve performance. No matter how you divide up the work, every request still has to cross the network.

To improve performance, you might want to check whether the 3rd party services offer some kind of bulk request API, so you can submit many items per round trip instead of one.
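The bulk endpoint and payload shape below are made up, so treat this as a rough sketch of the idea rather than working code:

import requests  # assuming the services speak HTTP/JSON

BULK_URL = 'https://api.example.com/items/bulk'  # hypothetical bulk endpoint
BATCH_SIZE = 500

def chunks(seq, size):
    # split the big id list into fixed-size batches
    for i in range(0, len(seq), size):
        yield seq[i:i + size]

def ingest_batch(ids):
    # one round trip for many items instead of one per item
    resp = requests.post(BULK_URL, json=[{'id': i} for i in ids])
    resp.raise_for_status()

for batch in chunks(list(range(1000000)), BATCH_SIZE):
    ingest_batch(batch)

Even with everything else unchanged, cutting 1 million requests down to 2,000 batch requests removes most of the per-request network overhead.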

If your workload permits it you could attempt some kind of caching. However, from your description of the task it sounds like that would have little effect, since you're primarily sending data, not requesting it. You could also consider reusing open connections (if you aren't already doing so); this avoids the very slow TCP handshake on each request. This kind of connection reuse is common in web browsers (e.g. Chrome).
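Roughly, with Python's requests library, that might look like this (untested sketch; the per-item URL is made up, and the point is simply that one Session keeps connections open across calls):

import requests

# A Session reuses open TCP connections (HTTP keep-alive),
# so repeated calls skip the TCP handshake.
session = requests.Session()

def ingest_item(id):
    # hypothetical per-item endpoint
    resp = session.post('https://api.example.com/items', json={'id': id})
    resp.raise_for_status()

Note that with multiprocessing.Pool each worker process would need its own Session (created once per worker, e.g. in a pool initializer), since connections can't be shared across processes.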

Disclaimer: I have no Python experience

Upvotes: 1

Neal Ehardt

Reputation: 10954

First: remember that you are performing a network task. You should expect your CPU and RAM usage to be low, because the network is orders of magnitude slower than your 12-core machine.

That said, it's wasteful to dedicate a whole process to each in-flight request. If you start experiencing issues from starting too many processes, you might try pycurl, as suggested here: Library or tool to download multiple files in parallel

This pycurl example looks very similar to your task: https://github.com/pycurl/pycurl/blob/master/examples/retriever-multi.py
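A stripped-down sketch of that pattern, adapted to your task (untested, and the ingest URL is made up), might look like:

import pycurl

NUM_CONNECTIONS = 50          # concurrent transfers in a single process
queue = list(range(1000000))  # your thing_ids

m = pycurl.CurlMulti()
free = [pycurl.Curl() for _ in range(NUM_CONNECTIONS)]
num_active = 0

while queue or num_active:
    # start new transfers while idle handles and work remain
    while free and queue:
        id = queue.pop()
        c = free.pop()
        c.setopt(pycurl.URL, 'https://api.example.com/ingest/%d' % id)  # hypothetical
        c.setopt(pycurl.WRITEFUNCTION, lambda data: None)  # discard response body
        m.add_handle(c)
        num_active += 1
    # drive all active transfers forward
    while True:
        ret, num_handles = m.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM:
            break
    # reap finished transfers and recycle their handles
    while True:
        num_q, ok_list, err_list = m.info_read()
        for c in ok_list:
            m.remove_handle(c)
            free.append(c)
        for c, errno, errmsg in err_list:
            m.remove_handle(c)
            free.append(c)
        num_active -= len(ok_list) + len(err_list)
        if num_q == 0:
            break
    # wait for network activity before looping again
    m.select(1.0)

One process driving 50 connections this way does all the waiting inside libcurl's event loop instead of in 50 separate processes or threads.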

Upvotes: 2
