Reputation: 237
I've seen many questions about parallelizing bash scripts, but so far I haven't found one that answers my question.
I have a bash script that runs two Python scripts sequentially (the fact that they are Python scripts is not important; it could be any other bash job):
python script_1.py
python script_2.py
Now, assume that script_1.py takes a certain (unknown) time to finish, while script_2.py has an infinite loop in it.
I'd like to run the two scripts in parallel, and when script_1.py finishes the execution I'd like to kill script_2.py as well.
Note that I'm not interested in doing this within the Python scripts themselves; I want to handle it from the bash side.
What I thought was to create 2 "sub" bash scripts: bash_1.sh and bash_2.sh, and to run them in parallel from a main_bash.sh script that looks like:
bash_1.sh & bash_2.sh
where each bash_i.sh job runs a script_i.py script.
However, this wouldn't terminate the second script's infinite loop once the first one finishes. Is there a way to do this, adding some sort of condition that kills one script when the other one is done?
As an additional (less important) point, I'd be interested in monitoring the terminal output of the first script, but not of the second one.
Upvotes: 1
Views: 73
Reputation: 531075
It's simpler than you think. When bash_1.sh finishes, just kill bash_2.sh. The trick is getting the process ID that kill will need to do this.
bash_2.sh &
b2_pid=$!
bash_1.sh
kill $b2_pid
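As a quick sanity check, the same pattern can be exercised with stand-in commands — here `sleep 1000` plays the role of the infinite script and `sleep 1` the finite one (these substitutes are illustrative, not the original Python scripts):

```shell
#!/usr/bin/env bash
# Stand-in demo of the $! pattern.
sleep 1000 &                 # background the "infinite" job
b2_pid=$!                    # $! holds the PID of the most recent background job
sleep 1                      # run the "finite" job in the foreground
kill "$b2_pid"               # it returned, so terminate the background job
wait "$b2_pid" 2>/dev/null || true   # reap the killed job; ignore its exit status
```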
You can also use job control, if enabled.
bash_2.sh &
bash_1.sh
kill %%
Note that you don't need bash scripts for this; you can run your Python scripts directly in the same fashion:
python script_2.py &
python script_1.py
kill %%
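For the side question about watching only the first script's output: redirecting the background job's streams keeps the terminal clean. A hedged sketch, again with stand-ins of my own (`sleep` and `echo` substitute for the two Python scripts):

```shell
#!/usr/bin/env bash
# Only the foreground job writes to the terminal; the background
# job's output is discarded entirely.
sleep 1000 >/dev/null 2>&1 &           # "infinite" job, output silenced
bg_pid=$!
echo "script_1 output stays visible"   # "finite" job, output visible
kill "$bg_pid"
wait "$bg_pid" 2>/dev/null || true     # reap it; ignore its exit status
```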
Upvotes: 2
Reputation: 4969
If your scripts need to start in that sequence, you could wait for bash_1 to finish:
bash_1 &
b1=$!
bash_2 &
b2=$!
wait $b1
kill $b2
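The same wait/kill sequence can be checked end to end with stand-in commands (`sleep 1` for the finite job, `sleep 1000` for the "infinite" one; these substitutes are mine, not the answer's):

```shell
#!/usr/bin/env bash
# Wait for the finite job, then kill the other one.
sleep 1 &        # stand-in for bash_1 (finishes on its own)
b1=$!
sleep 1000 &     # stand-in for bash_2 (would run "forever")
b2=$!
wait "$b1"       # block until the finite job exits
kill "$b2"       # then terminate the long-running job
wait "$b2" 2>/dev/null || true   # reap the killed job; ignore its exit status
```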
Upvotes: 5