Dor

Reputation: 7494

logging blocks of code to log files in bash

I have a huge bash script and I want to log specific blocks of code to specific, small log files (instead of just one huge log file).

I have the following two methods:

# in this case, 'log' is a bash function

# Using code block & piping
{
# ... bash code ...
} | log "file name"

# Using Process Substitution
log "file name" < <(
     # ... bash code ...
)

Both methods may interfere with the proper execution of the bash script, e.g. when assigning values to a variable (like the problem presented here).
How do you suggest logging the output of command blocks to log files?
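For reference, the variable-assignment problem comes from the subshell that a pipeline (or process substitution reader) introduces; redirecting the brace group directly keeps the block in the current shell. A minimal demonstration, with a hypothetical log file name:

```shell
#!/usr/bin/env bash
# Sketch: piping a block runs it in a subshell, so assignments are lost;
# plain redirection of the same block does not. "block_1.log" is hypothetical.

logfile="block_1.log"

# Piped version: $result is set inside a subshell and lost afterwards.
{ result="from-pipe"; echo "logged via pipe"; } | cat >"$logfile"
echo "after pipe:     result='${result:-unset}'"   # prints 'unset'

# Redirected version: the block runs in the current shell, so $result survives.
{ result="from-redirect"; echo "logged via redirect"; } >"$logfile" 2>&1
echo "after redirect: result='$result'"            # prints 'from-redirect'
```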


Edit: This is what I tried to do (among many other variations), but it doesn't work as expected:

function log()
{
    if [ -z "$counter" ]; then
        counter=1
        echo "" >> "./General_Log_File" # Create the summary log file
    else
        (( ++counter ))
    fi
    echo "" > "./${counter}_log_file"   # Create specific log file

    # Display text-to-be-logged on screen & add it to the summary log file
    #  & write text-to-be-logged to its corresponding log file
    exec 1> >(tee "./${counter}_log_file" | tee -a "./General_Log_File") 2>&1
}

log # Logs the following code block
{
    # ... Many bash commands ...
}

log # Logs the following code block
{
    # ... Many bash commands ...
}

The results vary between executions: sometimes the log files are created and sometimes they are not (which raises an error).

Upvotes: 1

Views: 1540

Answers (3)

Noam Manos

Reputation: 16981

For simple redirection of a bash code block, without a dedicated function, do:

( 
  echo "log this block of code"
  # commands ...
  # ...
  # ...
) &> output.log
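Note that the parentheses run the block in a subshell, so any variables assigned inside it are lost afterwards. If that matters, a brace group with the same redirection keeps the code in the current shell; a minimal sketch:

```shell
#!/usr/bin/env bash
# Same redirection, but with a brace group: the block runs in the
# current shell, so assignments like $status survive the redirect.
{
  echo "log this block of code"
  status="done"              # would be lost inside ( ... )
} &> output.log

echo "status=$status"        # prints 'status=done'
```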

Upvotes: 0

Dor

Reputation: 7494

Thanks to Sahas, I managed to achieve the following solution:

function log()
{
    [ -z "$counter" ] && counter=1 || (( ++counter ))

    if [ -n "$teepid" ]; then
        exec 1>&- 2>&-  # close file descriptors to signal EOF to the `tee`
                #  command in the bg process
        wait $teepid # wait for bg process to exit
    fi
    # Display text-to-be-logged on screen and
    #  write it to the summary log & to its corresponding log file
    ( tee "${counter}.log" < "$pipe" | tee -a "Summary.log" 1>&4 ) &
    teepid=$!
    exec 1>"$pipe" 2>&1 # redirect stdout & stderr to the pipe
}

# Create temporary FIFO/pipe
pipe_dir=$(mktemp -d)
pipe="${pipe_dir}/cmds_output"
mkfifo "$pipe"
exec 4<&1   # save value of FD1 to FD4

log # Logs the following code block
{
    # ... Many bash commands ...
}

log # Logs the following code block
{
    # ... Many bash commands ...
}

if [ -n "$teepid" ]; then
    exec 1>&- 2>&-  # close file descriptors to signal EOF to the `tee`
            #  command in the bg process
    wait $teepid # wait for bg process to exit
fi

It works - I tested it.
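One detail the listing above leaves out is cleanup: the temporary directory and FIFO are never removed, and FD 4 stays open. A hedged sketch of an EXIT trap that handles this, assuming the same `$pipe`/`$pipe_dir` names (the logging itself is omitted):

```shell
#!/usr/bin/env bash
# Sketch: EXIT-trap cleanup for the temporary FIFO used in the answer.
# Assumes the same $pipe_dir / $pipe naming as above.
pipe_dir=$(mktemp -d)
pipe="${pipe_dir}/cmds_output"
mkfifo "$pipe"
exec 4<&1             # save stdout on FD 4, as in the answer

cleanup() {
  exec 4>&-           # close the saved stdout copy
  rm -rf "$pipe_dir"  # remove the FIFO and its temp directory
}
trap cleanup EXIT

# ... logging via the FIFO would go here ...
```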

Upvotes: 0

Sahas

Reputation: 11399

You could try something like this:

function log()
{
    local logfile=$1
    local errfile=$2
    exec > "$logfile"
    # redirect stderr only if $errfile is not an empty string
    [ -n "$errfile" ] && exec 2> "$errfile"
}

log "$fileA" "$errfileA"
echo stuff
log "$fileB" "$errfileB"
echo more stuff

This redirects all stdout/stderr of the current process to a file, without spawning any subprocesses.
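Since `exec` changes the current shell's file descriptors rather than creating a subshell, variables assigned between `log` calls remain visible. A self-contained sketch of this approach, with hypothetical file names and a saved copy of the original stdout so it can be restored afterwards:

```shell
#!/usr/bin/env bash
# Sketch of the exec-based approach; all file names are hypothetical.
log() {
  local logfile=$1 errfile=$2
  exec >"$logfile"                        # stdout -> log file (current shell)
  [ -n "$errfile" ] && exec 2>"$errfile"  # stderr -> error file, if given
}

exec 3>&1              # keep the original stdout for later
log fileA.log errA.log
echo "stuff"           # written to fileA.log
x=42                   # survives: no subshell involved
log fileB.log errB.log
echo "more stuff"      # written to fileB.log

exec 1>&3 2>&3         # restore stdout (and point stderr there too)
echo "x=$x"            # prints 'x=42'
```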

Edit: The below might be a good solution then, but not tested:

pipe=$(mktemp -u)   # -u: print an unused name only, so the FIFO can be created
mkfifo "$pipe"
exec 1>"$pipe"

function log()
{
    if ! [[ -z "$teepid2" ]]; then
        kill $teepid2
    else
        tee <"$pipe" general_log_file &
        teepid1=$!
        count=1
    fi

    tee <"$pipe" "${count}_logfile" &
    teepid2=$!
    (( ++count ))
}

log
echo stuff
log
echo stuff2

if ! [[ -z "$teepid1" ]]; then kill $teepid1; fi

Upvotes: 1
