Reputation: 1
How can I spawn multiple parallel instances of an external exe from the generator loop for x1 in x: below, in a multiprocessing manner (so that one exe instance is always running per CPU thread)? If there is no way to do it with the pseudocode below, what would be the best/simplest alternative? Note that after each exe instance exits I will need to get the size of the file it produced and then delete it; the purpose of the code is to find a desirable combination of the x/y/z parameters.
The os.system line syntax is intentionally incorrect, for readability. Behind the state_x = x1 … state_z = z1 assignments there will be more code (exit-code check, getting and comparing the file size, and so on), so x1/y1/z1 will not always be passed to the variables.
x = list(range(1, 300+1))
y = list(range(1, 300+1))
z = list(range(1, 300+1))
state_x = []
state_y = []
state_z = []
import os
for x1 in x:
    for y1 in y:
        for z1 in z:
            os.system("external.exe -x1 -y1 -z1 outfile_x1_y1_z1.out")
            state_x = x1
            state_y = y1
            state_z = z1
UPDATE1
I simplified the code further to make it more understandable, replacing the os.system("external.exe... call with print so that the shell output makes it clearer what the code does. Disregard that the state_* = [] variables always end up holding the last values from the loops; this is just simplified code, and the expected result is simply a sign that the code works! The question is still the same: how do I spawn the exec/print calls in multiple processes from the looping generator?
x = list(range(1, 2+1))
y = list(range(3, 4+1))
z = list(range(5, 6+1))
state_x = []
state_y = []
state_z = []
import os
for x1 in x:
    for y1 in y:
        for z1 in z:
            print(x1, y1, z1)
            state_x = x1
            state_y = y1
            state_z = z1
Shell output:
==================== RESTART: D:/Python36-32/myscript5.py ==============
1 3 5
1 3 6
1 4 5
1 4 6
2 3 5
2 3 6
2 4 5
2 4 6
>>> state_x
2
>>> state_y
4
>>> state_z
6
>>>
UPDATE2:
The code below starts the external exe multiprocessed when run from IDLE, but the variables state_x, state_y, state_z are not passed out of the function to the global variables. After the code finishes, typing state_x in the Python shell returns an empty [].
code:
import itertools
import multiprocessing
import os
x = list(range(1, 2+1))
y = list(range(3, 4+1))
z = list(range(5, 6+1))
state_x = []
state_y = []
state_z = []
def do_work(x1, y1, z1):
    os.system("ping.exe 127.0.0.1 -n " + str(x1))
    global state_x
    state_x = x1
    global state_y
    state_y = y1
    global state_z
    state_z = z1
if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        results = pool.starmap(do_work, itertools.product(range(1, 3), range(3, 5), range(5, 7)))
Upvotes: -3
Views: 188
Reputation: 104812
When you want one process per CPU, the best approach is often to use multiprocessing.Pool. I think this should do roughly what you want (the exact details of your code's logic are not obvious, since you're overwriting state_x/state_y/state_z on each iteration, to no apparent effect).
import itertools
import multiprocessing

def do_work(x, y, z):
    # do your per-job stuff here, e.g. make the os.system() call, or whatever
    return result  # return whatever value you need the worker to send to the main process

if __name__ == "__main__":
    with multiprocessing.Pool() as pool:
        results = pool.starmap(do_work, itertools.product(range(1, 301), repeat=3))
    # do stuff with results here
A Pool created with no arguments will, by default, create one worker process per CPU core. You can tell it to use a different number of processes if you prefer, but that shouldn't be necessary in most cases.
Upvotes: 0