Reputation: 27
I have a bash script that runs every 10 minutes (via a crontab) and executes multiple Python scripts, each of which creates a CSV file that is then moved. It's basically this:
python3 first_script.py
wait;
sudo mv /script_dir/first_output.csv /var/www/html
wait;
python3 second_script.py
wait;
sudo mv /script_dir/second_output.csv /var/www/html
wait;
There are a lot of Python scripts, and sometimes one of them fails to execute, so the CSV isn't up to date.
Is it possible to write to a log file when there is an error, so I know what is happening (without having to check each Python script one by one)?
What I want is a txt file that I can check regularly, like this:
yyyy mm dd hh mm ss : the file first_script.py doesn't work
yyyy mm dd hh mm ss : the file second_script.py doesn't work
and, if possible: "the error was XXXXX"
Upvotes: 0
Views: 1182
Reputation: 181745
Since you only want the output in case of an error, you'll need to redirect the output to a temporary file first, and only write that file into your log if the script failed. I'm assuming that your scripts are well-behaved, i.e. they return a nonzero exit code on failure.
You can write a helper function to encapsulate all this:
function log_errors() {
    script_log=$(mktemp)
    # Run the given command, capturing stdout and stderr in the temp file
    "$@" > "$script_log" 2>&1
    exit_code=$?

    if [[ $exit_code -ne 0 ]]; then
        # On failure, append a timestamp, the command that was run,
        # its output, and its exit code to the log
        {
            date
            echo "\$ $(printf "%q " "$@")"
            cat "$script_log"
            echo "Failed with exit code $exit_code"
            echo
        } >> /var/log/my_scripts.log
    fi

    rm "$script_log"
}
log_errors python3 first_script.py
sudo mv /script_dir/first_output.csv /var/www/html
log_errors python3 second_script.py
sudo mv /script_dir/second_output.csv /var/www/html
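Since the function just runs whatever command it's given, you could also wrap the mv steps the same way if you want their failures logged too, for example:
log_errors sudo mv /script_dir/first_output.csv /var/www/html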
Example output:
Wed Mar 16 03:48:13 PM CET 2022
$ python3 second_script.py
python3: can't open file '/tmp/second_script.py': [Errno 2] No such file or directory
Failed with exit code 2
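You can then check the log regularly, e.g. with:
tail /var/log/my_scripts.log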
Upvotes: 1