Kevin Little

Reputation: 12976

Better way to make a bash script self-tracing?

I have certain critical bash scripts that are invoked by code I don't control and whose console output I can't see. I want a complete trace of what these scripts did, for later analysis. To do this, I want to make each script self-tracing. Here is what I am currently doing:

#!/bin/bash
# if last arg is not '_worker_', relaunch with stdout and stderr
# redirected to my log file...
if [[ "$BASH_ARGV" != "_worker_" ]]; then
    $0 "$@" _worker_ >>/some_log_file 2>&1  # add tee if console output wanted
    exit $?
fi
# rest of script follows...

Is there a better, cleaner way to do this?

Upvotes: 8

Views: 2183

Answers (3)

acue

Reputation: 401

You may want to check out a common open-source trace library with support for bash.

The currently available component is for bash scripting; Python and C++ components are coming soon, with Ruby, Java, JavaScript, SQL, PowerShell, and others to follow.

The license is Apache-2.0.

WKR Arno-Can Uestuensoez

Upvotes: 0

Kevin Reid

Reputation: 43782

#!/bin/bash
# From this point on, stdout and stderr of every command go to log_file
exec >>log_file 2>&1

echo Hello world
date

exec has a magic behavior regarding redirections: “If command is not specified, any redirections take effect in the current shell, and the return status is 0. If there is a redirection error, the return status is 1.”

Also, regarding your original solution, exec "$0" is better than "$0"; exit $?, because the former doesn't leave an extra shell process around until the subprocess exits.
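
For example, a minimal sketch of the question's relaunch pattern rewritten with exec (the _worker_ sentinel and /some_log_file are taken from the question; ${!#} is used here to check the last argument directly, without relying on BASH_ARGV):

#!/bin/bash
# Relaunch once with output redirected; exec replaces the current shell,
# so no parent shell process lingers while the worker runs.
if [[ "${!#}" != "_worker_" ]]; then
    exec "$0" "$@" _worker_ >>/some_log_file 2>&1
fi
# rest of script follows...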

Upvotes: 13

Anycorn

Reputation: 51475

Maybe you are looking for set -x? It makes bash print each command before executing it.
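
A minimal sketch of how that trace could be captured in a file rather than on the console (trace.log is a hypothetical path; BASH_XTRACEFD requires bash 4.1+):

#!/bin/bash
exec 5>>trace.log   # open a dedicated file descriptor for the trace output
BASH_XTRACEFD=5     # send xtrace output to that descriptor (bash 4.1+)
set -x              # print each command as it is executed

echo "doing work"
date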

Upvotes: 2
