Reputation: 1775
As the title suggests, how do I write a bash script that will execute, for example, 3 different Python programs as separate processes? And how can I then access each of these processes to see what is being logged to the terminal?
Edit: Thanks again. I forgot to mention that I'm aware of appending &, but I'm not sure how to access what each process is writing to the terminal. For example, I could run all 3 of these programs separately in different tabs and see what each one outputs.
Upvotes: 12
Views: 38217
Reputation: 8943
Another option is to use a terminal emulator to run the three processes. You could use xterm (or rxvt etc.) if you are using X.
xterm -e <program1> [arg] ... &
xterm -e <program2> [arg] ... &
xterm -e <program3> [arg] ... &
It depends on what you want. This approach pops up a terminal window per program, so you can watch each one's output in real time. You can also combine it with redirection to save the output, as in the sketch below.
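For instance, a minimal sketch combining xterm with tee (the Python script names and log file names here are hypothetical):
xterm -e bash -c 'python3 prog1.py 2>&1 | tee prog1.log' &
xterm -e bash -c 'python3 prog2.py 2>&1 | tee prog2.log' &
xterm -e bash -c 'python3 prog3.py 2>&1 | tee prog3.log' &
Each window shows its program's output live while tee also writes a copy to the log file.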
Upvotes: 1
Reputation: 24083
You can run a job in the background like this:
command &
This allows you to start multiple jobs in a row without having to wait for the previous one to finish.
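For example, a minimal sketch that starts three Python programs as separate background processes (the script names here are hypothetical):
python3 prog1.py &
python3 prog2.py &
python3 prog3.py &
wait    # optional: block until all three background jobs finish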
If you start multiple background jobs like this, they will all share the same stdout (and stderr), which means their output is likely to get interleaved. For example, take the following script:
#!/bin/bash
# countup.sh
for i in $(seq 3); do
    echo "$i"
    sleep 1
done
Start it twice in the background:
./countup.sh &
./countup.sh &
And what you see in your terminal will look something like this:
1
1
2
2
3
3
But could also look like this:
1
2
1
3
2
3
You probably don't want this, because it would be very hard to figure out which output belonged to which job. The solution? Redirect stdout (and optionally stderr) for each job to a separate file. For example,
command > file &
will redirect only stdout, and
command > file 2>&1 &
will redirect both stdout and stderr for command to file while running command in the background. This page has a good introduction to redirection in Bash. You can view the command's output "live" by tailing the file:
tail -f file
I would recommend running background jobs with nohup or screen, as user2676075 mentioned, so that your jobs keep running after you close your terminal session, e.g.
nohup command1 > file1 2>&1 &
nohup command2 > file2 2>&1 &
nohup command3 > file3 2>&1 &
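Putting that together, a minimal sketch of a complete launcher script (the Python script names and log file names are hypothetical):
#!/bin/bash
# run3.sh (hypothetical name): start three Python programs as separate
# background processes, each logging to its own file; nohup keeps them
# running after the terminal session closes
nohup python3 prog1.py > prog1.log 2>&1 &
echo "prog1 PID: $!"    # $! holds the PID of the last background job
nohup python3 prog2.py > prog2.log 2>&1 &
echo "prog2 PID: $!"
nohup python3 prog3.py > prog3.log 2>&1 &
echo "prog3 PID: $!"
You can then watch any single program with tail -f prog1.log, or check on the processes later using the printed PIDs.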
Upvotes: 28
Reputation: 922
Try something like:
command1 2>&1 | tee commandlogs/command1.log &
command2 2>&1 | tee commandlogs/command2.log &
command3 2>&1 | tee commandlogs/command3.log &
...
Then you can tail the files as the commands run. Remember, you can follow them all at once from that directory with tail -f *.log, as sketched below.
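A quick sketch of that, assuming the logs live in commandlogs/ as above (tail prints a ==> filename <== header before each file's chunk of output):
cd commandlogs
tail -f *.log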
Alternatively, you can set up a script to launch a detached screen session for each command:
screen -S CMD1 -d -m command1 ;
screen -S CMD2 -d -m command2 ;
screen -S CMD3 -d -m command3
...
Then reconnect to them later with screen -ls and screen -r [screen name]
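For example, a quick sketch of checking on and reattaching to one of those sessions:
screen -ls       # list sessions, e.g. "12345.CMD1 (Detached)"
screen -r CMD1   # reattach to the session named CMD1
# inside a session, press Ctrl-a d to detach again and leave it running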
Enjoy
Upvotes: 4