Grant

Reputation: 1337

Waiting for bash script job to finish before starting another

I have a feeling that this isn't going to be as simple as I'm hoping it will be...

I understand the concept of using & and then wait in bash scripts, but can this be applied to the same script being run multiple times while the first process still hasn't finished?

I'll try to explain what I mean better.

Say I have this script:

#!/bin/bash

COMPLETE="download complete"
wget -P /root/downloads/ http://linktoareallymassivefile.wav &
wait
echo "$COMPLETE"

For the moment, forget the fact that running this script again would just overwrite the previously downloaded file.

I execute it, it starts downloading, then I execute it again, but I'd like the first process to finish before the second one starts.

So would something like this work?

#!/bin/bash

wait
COMPLETE="download complete"
wget -P /root/downloads/ http://linktoareallymassivefile.wav &
wait
echo "$COMPLETE" &

I'm very much doubting that it would, but I think you can see what I'm asking.

Or, as I fear, is there a much more complicated queue based solution needed in this situation?

Upvotes: 0

Views: 3291

Answers (2)

unutbu

Reputation: 880577

Each time you run the script, a new process is started, and each process is independent of every other process. wait only waits for the current shell's own background jobs, so it will not affect any other script.
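As a minimal sketch (using sleep as a stand-in for the long download), each invocation only ever sees the jobs it started itself:

#!/bin/bash
# Minimal sketch: wait only knows about background jobs started by this shell.
sleep 30 &     # a child of *this* shell, standing in for the long wget
wait           # blocks until the sleep above finishes
# A wait in a second, separately launched copy of this script returns
# immediately, because that copy has no background jobs of its own yet.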

So either modify the script to consolidate all the commands:

wget -P /root/downloads/ http://linktoareallymassivefile1.wav
wget -P /root/downloads/ http://linktoareallymassivefile2.wav

Or make a new script to call the original script:

script.sh
script.sh

If you don't use &, the next command will not be executed until the first one finishes.
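For instance, a hypothetical wrapper (download.sh is a placeholder name for the original script) runs the two invocations strictly in sequence:

#!/bin/bash
# Hypothetical wrapper: download.sh stands in for the original script's name.
# With no &, the second call does not start until the first has finished.
./download.sh
./download.sh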

Upvotes: 1

Marcus Müller

Reputation: 36462

If you simply don't use & to push the process into the background, and remove the wait, the wget will take as long as it takes and the script will not continue until it has finished.
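A minimal sketch of the question's script with & and wait removed (same placeholder URL and download directory as above):

#!/bin/bash
# Without &, wget runs in the foreground and the script blocks here
# until the download has finished.
COMPLETE="download complete"
wget -P /root/downloads/ http://linktoareallymassivefile.wav
echo "$COMPLETE"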

Upvotes: 1
