Lynn Lin

Reputation: 3

Python: run multiple commands at the same time

Previously, I ran the two commands in a for loop, something like for x in $set: command. To save time, I want to run both commands at the same time, similar to the parallel mode of a Makefile.
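A rough sketch of the sequential version (the command names and the use of subprocess.call are placeholder assumptions):

import subprocess

# Placeholder commands; the real ones come from $set.
commands = ['command1', 'command2']

# Current approach: run each command one after another.
for cmd in commands:
    subprocess.call(cmd, shell=True)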

Thanks Lyn

Upvotes: 0

Views: 2135

Answers (3)

milkypostman

Reputation: 3043

The threading module won't give you much performance-wise because of the Global Interpreter Lock.

I think the best way to do this is to use the subprocess module and open each command with its own stdout.

import select
import subprocess

processes = {}
for cmd in ['cmd1', 'cmd2', 'cmd3']:
    # Launch each command with its own stdout pipe.
    p = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    processes[p.stdout] = p

while len(processes):
    # Block until at least one child's stdout is ready to read.
    rfds, _, _ = select.select(processes.keys(), [], [])
    for fd in rfds:
        process = processes[fd]
        print fd.read()

        # poll() sets returncode once the child has exited.
        if process.poll() is not None:
            print "Process {0} returned with code {1}".format(process.pid, process.returncode)
            del processes[fd]

You basically have to use select to see which file descriptors are ready, and then check each process's return code (via poll()) to see whether the read you just did means it has exited. Processes basically sit in a wait state until their stdout is closed. If you would like to do other things while you're waiting, you can put a timeout on select.select() so you stop waiting after that long; if the length of rfds is 0, you know the timeout happened.
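For example, a sketch of that timeout variant, dropping in for the select call in the loop above (the 0.5-second value is just illustrative):

# Wait at most 0.5 seconds for any child to produce output.
rfds, _, _ = select.select(processes.keys(), [], [], 0.5)
if len(rfds) == 0:
    # Timeout expired with nothing to read; do other work here.
    pass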

Upvotes: 3

richo

Reputation: 8989

Twisted or the select module is probably what you're after.

If all you want to do is run a bunch of batch commands, a shell script like

#!/bin/sh
for i in command1 command2 command3; do
    $i &
done

might work better. Alternatively, a Makefile like you said.
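If you'd rather stay in Python, a minimal sketch of the same fire-and-forget idea (the command strings are placeholders) might look like:

import subprocess

# Placeholder commands; substitute the real ones.
commands = ['command1', 'command2', 'command3']

# Start every command without waiting, like the shell's & operator.
procs = [subprocess.Popen(cmd, shell=True) for cmd in commands]

# Then wait for all of them to finish.
for p in procs:
    p.wait()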

Upvotes: 1

Mike DeSimone

Reputation: 42805

Look at the threading module.
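For instance, a minimal sketch (the command names and the use of subprocess.call are assumptions) that runs each command in its own thread:

import subprocess
import threading

# Placeholder commands; replace with the real ones.
commands = [['command1'], ['command2']]

def run(cmd):
    # The GIL is released while the thread waits on the child process.
    subprocess.call(cmd)

threads = [threading.Thread(target=run, args=(cmd,)) for cmd in commands]
for t in threads:
    t.start()
for t in threads:
    t.join()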

Upvotes: 0
