Reputation: 8792
I have a pretty simple bash script that should grep a file for multiple phrases that I want to flag.
It works up to a point, but I am falling over when I want to grep for ' print ' or ' puts ' (note the whitespace before and after each word).
The grep is ignoring the whitespace in the input.
Here is my code (stuff that isn't relevant has been cut out):
#!/bin/sh
bad_phrases=('console.log(' 'binding.pry' ':focus,' 'alert(' ' print ' ' puts ')
bad_phrase_found=false
FILE='my_test_file.txt'

for bad_phrase in ${bad_phrases[*]} ; do
  if grep -q "$bad_phrase" $FILE ; then
    bad_phrase_found=true
    echo "A '$(tput bold)$(tput setaf 1)$bad_phrase$(tput sgr0)' was found in $(tput bold)$(tput setaf 5)$FILE$(tput sgr0)"
  fi
done

if $bad_phrase_found ; then
  exit 1
fi

exit 0
I have looked at setting IFS to '~' and splitting the array up that way, but that killed the grep command completely.
The example output of the script is:
'print' was found in my_test_file.txt
Any help would be greatly appreciated.
Upvotes: 0
Views: 126
Reputation: 274592
You need to use "${bad_phrases[@]}" instead of ${bad_phrases[*]}, and don't forget the double quotes. An unquoted ${bad_phrases[*]} is word-split on whitespace after expansion, so ' print ' reaches grep as just print. You should always make it a habit to quote all your variables in order to suppress word splitting.
for bad_phrase in "${bad_phrases[@]}" ; do
  if grep -q "$bad_phrase" "$FILE" ; then
    bad_phrase_found=true
    echo "A '$(tput bold)$(tput setaf 1)$bad_phrase$(tput sgr0)' was found in $(tput bold)$(tput setaf 5)$FILE$(tput sgr0)"
  fi
done
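Here is a minimal demonstration of the difference, assuming bash (the array values below are just illustrative, not taken from your script):

arr=(' print ' ' puts ')

# Unquoted ${arr[*]}: the expansion is word-split on whitespace, so the spaces are lost
for p in ${arr[*]} ; do printf '[%s]\n' "$p" ; done      # prints [print] and [puts]

# Quoted "${arr[@]}": each element is passed along intact, spaces and all
for p in "${arr[@]}" ; do printf '[%s]\n' "$p" ; done    # prints [ print ] and [ puts ]

The same reasoning applies to quoting "$FILE" inside the loop: it keeps a filename containing spaces from being split into multiple arguments to grep.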
Upvotes: 2