John28

Reputation: 773

Python Multiprocessing Pool.map

I'm trying to read files with multiprocessing in Python. Here is a small example:

import multiprocessing
from time import *

class class1():
    def function(self, datasheetname):
        # here I start reading my datasheet

if __name__ == '__main__':
    # Test with multiprocessing
    pool = multiprocessing.Pool(processes=4)
    pool.map(class1("Datasheetname"))
    pool.close()

Now I get the following error:

TypeError: map() missing 1 required positional argument: 'iterable'

In another thread on this board I was given the hint to do this with a ThreadPool, but I don't know how. Any ideas?

Upvotes: 3

Views: 8184

Answers (1)

sirfz

Reputation: 4277

Pool.map:

map(func, iterable[, chunksize])

A parallel equivalent of the map() built-in function (it supports only one iterable argument though). It blocks until the result is ready.

This method chops the iterable into a number of chunks which it submits to the process pool as separate tasks. The (approximate) size of these chunks can be specified by setting chunksize to a positive integer.

You need to pass an iterable; each of its elements is passed to the target func as an argument in a worker process.

Example:

from multiprocessing import Pool

def function(sheet):
    # do something with sheet
    return "foo"

pool = Pool(processes=4)
result = pool.map(function, ['sheet1', 'sheet2', 'sheet3', 'sheet4'])
# result will be ['foo', 'foo', 'foo', 'foo']

Upvotes: 5
