Reputation: 2686
I want all the errors from the commands executed in my shell script to be written to a log file, which is as simple as executing
exec 2>> /var/log/mylog.txt
but what if I want to prepend a date to each error line?
Upvotes: 1
Views: 950
Reputation: 51990
If you are using bash, you have access to co-processes that might serve that purpose:
#!/bin/bash
# The co-process responsible for adding the date. The fd swap
# (3>&1 1>&2- 2>&3-) sends the dated lines to the script's original
# stderr instead of back into the co-process pipe.
coproc myproc {
    bash -c 'while IFS= read -r line; do echo "$(date): $line"; done' 3>&1 1>&2- 2>&3-
}
# Redirect stderr to the co-process
exec 2>&${myproc[1]}
# Here is the script proper -- classical; no (visible) redirection
ls non-existant-file1 existant-file non-existant-file2
Saving the above as t.sh:
sh$ touch existant-file
sh$ ./t.sh 2> error.log
existant-file
sh$ cat error.log
Tue Jul 15 00:15:29 CEST 2014: ls: cannot access non-existant-file1: No such file or directory
Tue Jul 15 00:15:29 CEST 2014: ls: cannot access non-existant-file2: No such file or directory
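If you would rather have the dated errors appended directly to the log file from the question instead of whatever stderr the script inherits, a minimal variation (assuming /var/log/mylog.txt is writable by the script) is to redirect inside the co-process:
#!/bin/bash
# Variation: the co-process appends dated lines straight to the log
# file (the path from the question), so callers need no redirection.
coproc myproc {
    bash -c 'while IFS= read -r line; do echo "$(date): $line"; done' >> /var/log/mylog.txt
}
exec 2>&${myproc[1]}
ls non-existant-file1   # this error ends up dated in /var/log/mylog.txt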
Upvotes: 2
Reputation: 51990
I maintain this answer as it might be sufficient, but other options are available -- see my other answer.
The first option that came to mind is to use a fifo and some redirections:
#!/bin/sh
TEMPDIR=$(mktemp -d)
mkfifo "${TEMPDIR}/fifo"
# The reader: prefix each line with the current date. close("date")
# re-runs the command for every line instead of reusing a stale result.
awk '{ "date" | getline the_date; close("date"); print the_date ": " $0; fflush() }' < "${TEMPDIR}/fifo" &
exec 2> "${TEMPDIR}/fifo"
# Open descriptors keep the fifo alive, so the directory entry
# can be removed right away.
rm -f "${TEMPDIR}/fifo"
rmdir "${TEMPDIR}"
#
# Your commands here
#
exec 2>&-   # close stderr: the awk reader sees EOF and exits
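If GNU awk is available, a sketch of the same fifo approach can use gawk's strftime() built-in (a gawk extension, so not portable sh+awk) to avoid spawning date for every line:
#!/bin/sh
TEMPDIR=$(mktemp -d)
mkfifo "${TEMPDIR}/fifo"
# gawk's strftime() timestamps each line without forking `date`.
gawk '{ print strftime("%a %b %e %H:%M:%S %Z %Y") ": " $0; fflush() }' < "${TEMPDIR}/fifo" &
exec 2> "${TEMPDIR}/fifo"
rm -f "${TEMPDIR}/fifo"
rmdir "${TEMPDIR}"
#
# Your commands here
#
exec 2>&-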
Upvotes: 2
Reputation: 212238
Create a pipe and run your stderr through a perl script. Something like:
#!/bin/sh
trap 'rm -f "$F"' 0
F=$(mktemp)
rm -f "$F"
mkfifo "$F"
# The reader: prefix each incoming line with a human-readable timestamp.
perl -ne 'print localtime() . ": " . $_' < "$F" >&2 &
exec 2> "$F"
As written, this prints timestamps and messages to the same stderr the script had when it began, so you can append to a log file by redirecting when the script is run. Or you can hard-code the redirection on the line that invokes perl.
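For instance, a minimal sketch of that hard-coded variant, assuming the log path from the question:
# Append dated errors straight to the log file instead of the
# inherited stderr (path taken from the question).
perl -ne 'print localtime() . ": " . $_' < "$F" >> /var/log/mylog.txt &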
Upvotes: 1