Reputation: 5655
I was thinking about adding multiprocessing to one of my scripts to increase performance.
It's nothing fancy; the main method takes 1-2 parameters.
Is there anything wrong with just running four clones of the same script from the terminal, versus actually adding multiprocessing to the Python code?
ex. for four cores:
~$ script.py &
~$ script.py &
~$ script.py &
~$ script.py &
I've read that Linux/Unix OSes automatically divide programs amongst the available cores.
Sorry if the stuff I've mentioned above is totally wrong; none of it was formally learned, it all came from reading online.
Upvotes: 5
Views: 1930
Reputation: 9696
Using multiprocessing might also become the better solution if you want to run your script, say, 100 times on only 4 cores. Then your terminal-based approach would become pretty nasty. In this case you might want to use a Pool from the multiprocessing module.
Upvotes: 1
Reputation: 4928
Martijn Pieters' comment hits the nail on the head, I think. If each of your processes only consumes a small amount of memory (so that you can easily have all four running in parallel without running out of RAM) and if your processes do not need to communicate with each other, it is easiest to start all four processes independently, from the shell, as you suggested.
The Python multiprocessing module is very useful if you have slightly more complex requirements. You might, for instance, have a program that needs to run in serial at startup, then spawn several copies for the more computationally intensive section, and finally do some post-processing in serial. multiprocessing would be invaluable for this sort of synchronization.
Alternatively, your program might require a large amount of memory (maybe to store a large matrix in scientific computing, or a large database in web programming). multiprocessing lets you share that object among the different processes so that you don't have n copies of the data in memory (using multiprocessing.Value and multiprocessing.Array objects).
Upvotes: 6