Reputation: 4830
The script below works well, except that it merges the standard error stream into the standard output stream of the sourcing script. Any suggestions on how to fix that?
#!/usr/bin/env bash
# Source this from a script to capture standard out and standard error and
# `tee` them to a log file. For example:
#
# . /usr/bin/logy /var/log/project/$0.log
#
# The logging uses >(process substitution). Process substitution is supported
# in shells like bash and zsh but is not supported in sh.
_LOG=${1?log file}
test $# -eq 1 || exit 1
mkdir -p "$(dirname "$_LOG")"
# Append stdout and stderr to the log file
exec > >(
    echo -e "START\t$(date)" >> "$_LOG"
    tee -a "$_LOG"
    echo -e "END\t$(date)" >> "$_LOG"
) 2>&1
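For background, this is the redirection pattern the script relies on, shown as a minimal stand-alone sketch (the file name demo.log is just a placeholder): exec replaces the shell's stdout with a pipe to a background tee started by process substitution, and 2>&1 then folds stderr into that same pipe.

#!/usr/bin/env bash
# Send this shell's stdout to a background `tee` via process substitution,
# then point stderr at the same pipe.
exec > >(tee -a demo.log) 2>&1

echo "reaches the terminal and demo.log"
echo "so does this stderr line, via the same pipe" >&2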
Here is an example of a calling script, saved as t:
#!/usr/bin/env bash
. /usr/bin/logy $0.log
echo stdout
echo stderr >&2
exit 1
Run the script:
$ ./t
$ echo $? # $? is the return value
1
Good, the return value 1 was preserved...
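As an aside, that is the main reason for the exec/process-substitution approach rather than piping the whole script through tee: with a plain pipeline, $? would be tee's status rather than the script's, unless something like pipefail is used. A rough sketch, not part of the script above:

#!/usr/bin/env bash
# With a plain pipeline, $? is tee's exit status, so the failure is masked.
false | tee -a t.log
echo $?             # prints 0

# pipefail makes the pipeline report the failing command's status instead.
set -o pipefail
false | tee -a t.log
echo $?             # prints 1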
What was logged?
$ cat t.log
START Thu, Feb 07, 2013 2:58:57 PM
stdout
stderr
END Thu, Feb 07, 2013 2:58:57 PM
The idea is to produce a single log file per script and then use logrotate
to maintain them.
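For what it's worth, the rotation side would just be a standard logrotate rule along these lines; the path, schedule, and counts here are only placeholders:

# /etc/logrotate.d/project (illustrative only)
/var/log/project/*.log {
    weekly
    rotate 4
    compress
    missingok
    notifempty
    copytruncate    # copy-then-truncate, since tee keeps the file open
}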
Here is the issue: the standard output and standard error streams were merged. The following command prints nothing, which shows that the standard error stream went to standard out:
./t > /dev/null
This command prints both lines from the echo statements, showing that both went to standard out:
./t 2> /dev/null
Is there a good way to keep the streams separate while still preserving, in the log file, the order of the stdout/stderr statements? Because ordering matters, I don't think two exec
statements are an option.
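For reference, this is the sort of two-exec variant I mean; it keeps the streams separate, but each tee runs as its own asynchronous process, so as far as I can tell there is no guarantee that the lines land in the log in the order they were written:

# Keep stdout and stderr separate, each through its own tee.
exec  > >(tee -a "$_LOG")
exec 2> >(tee -a "$_LOG" >&2)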
Upvotes: 1
Views: 1112