Reputation: 4094
I use bash scripts on Linux to migrate database output files, and I was wondering about a way to handle errors when executing a Linux command in a bash script.
For example, normally when I want to loop through files in a directory, I will write it like this:
# list files and grep results for .sql extension
for FILE in `ls | grep ".sql"`
do
echo "found file: $FILE"
done
This works because grep prints the file name if it has a .sql extension and prints nothing otherwise.
I was wondering how to use a Linux command that returns either a result or an error, such as
ls ./*.sql
which returns the name of the file, but if it doesn't find a file it prints the error
ls: ./*.sql: No such file or directory
Is it possible to check if a Linux command is returning an error in a bash script?
Upvotes: 2
Views: 1114
Reputation: 4209
You can check for errors using && and ||:
ls ./*.sql && echo "some files exist" || echo "no such files exist"
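In a script, the same check can also be written with an if, since if branches on the command's exit status; a minimal sketch (the output redirection is an addition here, just to hide ls's own messages):
# hide ls output and its error message; branch only on its exit status
if ls ./*.sql >/dev/null 2>&1; then
    echo "some files exist"
else
    echo "no such files exist"
fi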
You can get the exit code of the last run program by checking $?:
ls ./*.sql ; echo $?
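If you need that exit code later in the script, a small sketch that saves it in a variable (status is just an illustrative name):
ls ./*.sql
status=$?   # capture the exit code right away, before running anything else
echo "ls exited with status $status"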
Upvotes: 2
Reputation: 2409
You can check the exit code of the last run command with the $? variable in bash. Conventionally, 0 is returned for successful executions and 1 or higher for failed executions. As an example:
$ touch /root/test
touch: cannot touch ‘/root/test’: Permission denied
$ echo $?
1
You can incorporate this in your bash scripts like so:
#!/bin/bash
touch /root/test
if [ $? -eq 0 ]
then
    echo "Successfully created file"
else
    echo "Could not create file"
fi
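As a side note, because if already branches on a command's exit status, the same logic can be expressed by testing the command directly; a minimal equivalent sketch:
#!/bin/bash
# if runs touch and branches on its exit status directly
if touch /root/test
then
    echo "Successfully created file"
else
    echo "Could not create file"
fi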
Upvotes: 1
Reputation: 784868
First of all, you don't really need the ls command here. You can just do:
# loop over files with the .sql extension
for file in *.sql; do
    echo "found file: $file"
done
Then, to avoid getting a single iteration when no *.sql files exist, use:
shopt -s nullglob
# loop over files with the .sql extension
for file in *.sql; do
    echo "found file: $file"
done
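To tie this back to the migration scenario in the question, here is a sketch that combines the glob loop with an exit-status check; process_sql_file is a hypothetical placeholder for whatever command handles each file:
#!/bin/bash
shopt -s nullglob
for file in *.sql; do
    echo "found file: $file"
    # process_sql_file is a hypothetical placeholder command
    if ! process_sql_file "$file"; then
        echo "error while processing $file" >&2
    fi
done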
In general, you can check $? after running any command to get the exit status of the most recently executed command.
Upvotes: 2