Reputation: 15
I have an issue optimising my bash script. I have a few patterns I need to look for in a log file; if any one of the patterns appears in the log file, then do SOMETHING. So far I have this, but how can I optimise it so it doesn't need so many variables?
search_trace() {
TYPE=$1
for i in `find ${LOGTRC}/* -prune -type f -name "${USER}${TYPE}*" `
do
res1=0
res1=`grep -c "String1" $i`
res2=0
res2=`grep -c "String2" $i`
res3=0
res3=`grep -c "String3" $i`
res4=0
res4=`grep -c "String4" $i`
if [ $res1 -gt 0 ] || [ $res2 -gt 0 ] || [ $res3 -gt 0 ] || [ $res4 -gt 0 ]; then
write_log W "Something is done ,because of connection reset in ${i}"
sleep 5
fi
done
}
Upvotes: 1
Views: 72
Reputation: 149806
You could simply use alternation syntax in the regular expression you pass to grep, e.g.
if grep -q -E '(String1|String2|String3|String4)' filename; then
# do something
fi
The -E option makes grep use extended regular expressions (including the alternation (|) operator).
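For example (a minimal sketch, reusing LOGTRC, USER, TYPE and write_log from the question), the four res* counters collapse into a single test per file:
search_trace() {
    TYPE=$1
    for i in $(find "${LOGTRC}"/* -prune -type f -name "${USER}${TYPE}*"); do
        # one grep replaces the four separate counts; note the unquoted
        # command substitution still word-splits on spaces in file names
        if grep -q -E 'String1|String2|String3|String4' "$i"; then
            write_log W "Something is done, because of connection reset in ${i}"
            sleep 5
        fi
    done
}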
Upvotes: 1
Reputation: 246827
search_trace() {
find "$LOGTRC"/* -prune -type f -name "$USER${1}*" |
while IFS= read -r filename; do
if grep -q -e String1 -e String2 -e String3 -e String4 "$filename"; then
write_log W "Something is done ,because of connection reset in $filename"
sleep 5
fi
done
}
grep's -q option is good for use in an if-condition: it is efficient since it will exit successfully when it finds the first match -- it doesn't have to read the rest of the file.
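For comparison, a quick illustration (the file name is just a placeholder):
# -c scans the whole file to count every match;
# -q exits with status 0 as soon as one match is found
grep -c "String1" "$filename"                  # prints the match count
grep -q "String1" "$filename" && echo "found"  # succeeds on the first match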
Upvotes: 0