Reputation: 801
I am trying to achieve multiprocessing in Python. I will have at least 500 elements in a list. I have a function to which each element of the list needs to be passed as an argument, and each of these function calls should be executed as a separate process using multiprocessing, whether that means starting a new interpreter or otherwise. Following is some pseudocode.
    def fiction(arrayElement):
        # perform some operations here
        pass

    arrayList = []
    for eachElement in arrayList:
        fiction(eachElement)
I want to multiprocess the function calls made under

    for eachElement in arrayList:

so that I can use the multiple cores of my box. All help is appreciated.
Upvotes: 0
Views: 6012
Reputation: 2618
The multiprocessing module contains all sorts of basic classes that can be helpful for this:
    from multiprocessing import Pool

    def f(x):
        return x*x

    if __name__ == '__main__':      # guard needed when worker processes are spawned (e.g. on Windows)
        p = Pool(5)                 # a pool of 5 worker processes
        print(p.map(f, [1, 2, 3]))  # prints [1, 4, 9]
And the work will be distributed among the pool's worker processes.
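Applied to your case, a minimal sketch might look like the following (assuming fiction is defined at module level and arrayList holds your 500+ elements; both are placeholders here):

    from multiprocessing import Pool, cpu_count

    def fiction(arrayElement):
        # perform some operations here and return a result
        return arrayElement

    if __name__ == '__main__':
        arrayList = list(range(500))            # stand-in for your real elements
        pool = Pool(processes=cpu_count())      # one worker per CPU core
        results = pool.map(fiction, arrayList)  # blocks until every element is processed
        pool.close()
        pool.join()
        print(len(results))

Pool.map takes care of chunking the list across the workers and collecting the results in order, so you do not have to manage the individual processes yourself.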
This is fairly simple, but you can achieve much more using external packages, most notably message-oriented middleware.
Prime examples are ActiveMQ, RabbitMQ and ZeroMQ.
RabbitMQ combines a good Python API with simplicity. You can see here how simple it is to create a dispatcher-workers pattern, in which one process sends the workload and other processes perform it.
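For illustration only, a rough sketch of that split using the pika client could look like this (the localhost broker, the task_queue name, and the element values are my own placeholders, and the exact basic_consume signature varies between pika versions):

    import pika

    # dispatcher.py -- pushes one message per element onto a work queue
    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='task_queue', durable=True)

    for eachElement in ['a', 'b', 'c']:          # stand-in for arrayList
        channel.basic_publish(exchange='',
                              routing_key='task_queue',
                              body=str(eachElement))
    connection.close()

    # worker.py -- run one of these per core; each consumes elements and processes them
    def callback(ch, method, properties, body):
        # perform some operations here (the equivalent of fiction)
        ch.basic_ack(delivery_tag=method.delivery_tag)

    connection = pika.BlockingConnection(pika.ConnectionParameters('localhost'))
    channel = connection.channel()
    channel.queue_declare(queue='task_queue', durable=True)
    channel.basic_qos(prefetch_count=1)          # hand each worker one element at a time
    channel.basic_consume(queue='task_queue', on_message_callback=callback)
    channel.start_consuming()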
ZeroMQ is a bit more low-level, but it is very lightweight and does not require an external broker.
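With ZeroMQ the same idea can be sketched with a PUSH/PULL socket pair using the pyzmq bindings (the port number and element values below are arbitrary placeholders):

    import zmq

    # ventilator.py -- binds a PUSH socket and sends one message per element
    context = zmq.Context()
    sender = context.socket(zmq.PUSH)
    sender.bind("tcp://*:5557")

    for eachElement in ['a', 'b', 'c']:      # stand-in for arrayList
        sender.send_string(str(eachElement))

    # worker.py -- run one per core; connects a PULL socket and processes elements
    context = zmq.Context()
    receiver = context.socket(zmq.PULL)
    receiver.connect("tcp://localhost:5557")

    while True:
        element = receiver.recv_string()
        # perform some operations here (the equivalent of fiction)

ZeroMQ load-balances the pushed messages across however many workers are connected, which gives you the same fan-out as Pool, and it can span multiple machines if needed.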
Upvotes: 4