User19

Reputation: 121

Running N processes in parallel in Linux

I am trying to create a Python script that will read commands from a file and then run N of them simultaneously. This is what I have so far, which does run N in parallel.

from subprocess import Popen
import time
with open('commands.txt') as f:
    commands = f.read().split('\n')
test_number = 20
while True:
    processes = []
    time.sleep(6)
    for com in commands[test_number - 20:test_number]:
        processes.append(Popen(com, shell=True))
        time.sleep(6)  # 6-second delay between starting commands

    for i, process in enumerate(processes):
        process.wait()
        print(f"Command #{i} finished")
    test_number += 20
    

I need to put a 6-second delay between starting each command. In my solution, the fastest command in a batch has to wait for the slowest one before the next batch starts in the while True loop. Is there a better way to do this?

Upvotes: 1

Views: 367

Answers (1)

Viktor Tóth

Reputation: 499

You can use multiprocessing.Pool to create a pool of worker processes and run your shell commands inside them, as described here. Note that Popen returns immediately, so the worker function has to wait() on the process; otherwise the pool would hand out new tasks before the commands actually finish.

import time
from subprocess import Popen
from multiprocessing import Pool

with open('commands.txt') as f:
    commands = f.read().split('\n')

def run_com(com, i):
    process = Popen(com, shell=True)
    process.wait()  # keep this worker occupied until the command finishes
    print(f"Command #{i} finished")

test_number = 20
pool = Pool(test_number)  # at most test_number commands run at once
for i, com in enumerate(commands):
    time.sleep(6)  # wait 6 seconds between starting each command
    pool.apply_async(run_com, (com, i))  # run command in a free worker
pool.close()  # once tasks are completed the worker processes exit
pool.join()  # waits for all the tasks to finish

This way the pool keeps track of which of its worker processes are occupied and assigns each new task to a free one, so the commands that take longer will not hold up the others.
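For what it's worth, the same pattern can also be sketched with the standard-library concurrent.futures module instead of multiprocessing.Pool. This is just an illustrative alternative, assuming Python 3.5+ for subprocess.run; threads are enough here, since each worker only blocks waiting for its subprocess to exit:

import time
import subprocess
from concurrent.futures import ThreadPoolExecutor

with open('commands.txt') as f:
    commands = f.read().split('\n')

def run_com(com, i):
    subprocess.run(com, shell=True)  # blocks until the command exits
    print(f"Command #{i} finished")

with ThreadPoolExecutor(max_workers=20) as executor:
    for i, com in enumerate(commands):
        time.sleep(6)  # 6-second gap between starting commands
        executor.submit(run_com, com, i)
    # leaving the with-block waits for all submitted tasks to finish

Threads avoid forking an extra Python process per worker, and the shell commands still run as separate OS processes either way.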

Upvotes: 1
