Shaun

Reputation: 311

Running a function in each iteration of a loop as a new process in python

I have this:

from multiprocessing import Pool

pool = Pool(processes=4)

def createResults(uniqPath):
    *(there is some code here that populates a list - among other things)*

for uniqPath in uniqPaths:
    pool.map(createResults, uniqPath)

pool.close()
pool.join()

I don't know if it's possible, but can I run the createResults function called in that loop as a new process on each iteration?

I'm populating a list from a 4-million-line file and it's taking 24+ hours to run. (Obviously the code above does not work.)

Thanks!

Upvotes: 2

Views: 2181

Answers (1)

Bharel

Reputation: 26991

Instead of:

for uniqPath in uniqPaths:
    pool.map(createResults, uniqPath)

Do this:

pool.map(createResults, uniqPaths)

You must call map on the iterable itself so the pool can distribute the items across workers concurrently; calling it once per item inside a loop serializes the work.
Keep in mind, though, that a list populated inside a worker won't be shared with the parent process. Have createResults return its result instead, or, if you share state via a multiprocessing Array(), make sure access to it is process-safe.
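A minimal sketch of the corrected pattern, assuming createResults can be rewritten to return its result (the body here is a placeholder standing in for the asker's real per-path work):

```python
from multiprocessing import Pool

def createResults(uniqPath):
    # Return the data instead of appending to a list in the parent:
    # worker processes don't share memory with the calling process.
    return uniqPath.upper()  # placeholder for the real per-path work

if __name__ == "__main__":
    uniqPaths = ["a/b", "c/d", "e/f"]
    with Pool(processes=4) as pool:
        # A single map call hands every item to the pool at once.
        results = pool.map(createResults, uniqPaths)
    print(results)  # ['A/B', 'C/D', 'E/F']
```

map returns the results in the same order as the input iterable, so the parent ends up with one combined list without any shared state between processes.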

Upvotes: 1
