Reputation: 1375
I am trying to run a file that contains a sequence of commands/scripts to run with arguments, like:
ls /etc/
cat /etc/hosts
script.sh some parameters
...
This seems to work fine, but in some cases the while loop ends prematurely. This appears to happen only when the script being executed contains an SSH/SCP call at the end. The code that reads the file:
while IFS= read -r line
do
    # Split the line into the command and its parameters
    IFS=', ' read -a parameters <<< "$line"
    cmd="${parameters[0]}"
    unset parameters[0]
    runScriptAndCheckError "$cmd" "${parameters[@]}"
done < "$SCRIPT_FILENAME"
When using set -x:
+ checkError 0 'ERROR: script.sh failed'
+ '[' 0 -ne 0 ']'
+ IFS=
+ read -r line
It looks as though there is no more input, although there are still lines in the file. If I comment out runScriptAndCheckError "$cmd" "${parameters[@]}", then it does print many more lines.
I am not sure what is wrong with this code. It would be really helpful if someone could please help.
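For context, the helper function is not shown in the question; a minimal hypothetical stand-in consistent with the set -x trace below (the names runScriptAndCheckError and checkError come from the trace, everything else is assumed) might look like:

```shell
# Hypothetical reconstruction: run the command with its arguments,
# then hand the exit status to checkError.
runScriptAndCheckError() {
    cmd=$1
    shift
    "$cmd" "$@"
    checkError $? "ERROR: $cmd failed"
}

# Assumed behavior: print the message and return non-zero
# when the status is non-zero.
checkError() {
    if [ "$1" -ne 0 ]; then
        echo "$2" >&2
        return "$1"
    fi
    return 0
}
```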
Upvotes: 1
Views: 970
Reputation: 800
The key, as @chepner states, is that the statements inside:
while read line; do
    <command>
done
will interfere with the loop if they attempt to read from stdin.
If you don't want a command inside the loop reading from stdin, prevent it from doing so as follows:
while read line; do
    cmd < /dev/null
done
Now your read loop will not lose any input.
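To see the effect concretely, here is a self-contained sketch (head stands in for ssh/scp, since both drain stdin by default; the file and variable names are illustrative):

```shell
tmp=$(mktemp)
printf 'one\ntwo\nthree\n' > "$tmp"

# Without the redirect, head drains the loop's remaining input
# after the first iteration.
broken=0
while IFS= read -r line; do
    head -n 100 > /dev/null
    broken=$((broken + 1))
done < "$tmp"

# With stdin redirected from /dev/null, the loop sees every line.
fixed=0
while IFS= read -r line; do
    head -n 100 > /dev/null < /dev/null
    fixed=$((fixed + 1))
done < "$tmp"

echo "broken=$broken fixed=$fixed"   # broken=1 fixed=3
rm -f "$tmp"
```

With ssh specifically, the -n option has the same effect as the /dev/null redirection.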
Upvotes: 1
Reputation: 531165
If runScriptAndCheckError also reads from standard input, it will consume lines from $SCRIPT_FILENAME. Have the read command in the while loop use a different file descriptor:
while IFS= read -r line <&3; do
...
done 3< "$SCRIPT_FILENAME"
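A runnable sketch of this approach (assumed file names; the here-document stands in for whatever the loop body would normally read from stdin):

```shell
tmp=$(mktemp)
printf 'one\ntwo\nthree\n' > "$tmp"

count=0
while IFS= read -r line <&3; do
    # This read consumes the loop's ordinary stdin (the here-document),
    # leaving descriptor 3 untouched for the loop condition.
    IFS= read -r payload
    count=$((count + 1))
done 3< "$tmp" <<'EOF'
payload-a
payload-b
payload-c
EOF

echo "count=$count last-payload=$payload"   # count=3 last-payload=payload-c
rm -f "$tmp"
```

Because the loop reads from descriptor 3, commands in the body are free to consume stdin without stealing the loop's input.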
Upvotes: 4