Pent-

Reputation: 43

Bash save/redirect stdout and stderr when program is killed

When ProgramA is killed by an external process (Killed: 9), I cannot redirect its output (stdout/stderr) or capture it in a variable.

ProgramA:

$ ./ProgramA arg1
  This is on stderr
  This is on stdout
  Killed: 9

Failing to save to a variable:

$ ProgramA_Output=`ProgramA arg1`
$ echo "$ProgramA_Output"

$

Redirection to a file also does not work:

$ ProgramA arg1 > output.txt
$ cat ./output.txt
$

Any clues on how to save / redirect the output?

Upvotes: 1

Views: 799

Answers (1)

Charles Duffy

Reputation: 295291

The most likely immediate cause here is that your program is only flushing its buffers on a line-by-line basis when output is to a TTY; hence, when redirected to a file or a FIFO, it hasn't flushed yet when the SIGKILL is delivered -- and since a SIGKILL can't be trapped or delayed, it has no opportunity to perform a flush at that time.
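The effect is easy to reproduce. This sketch (assuming `python3` is available; it stands in for ProgramA, since its stdio is block-buffered when redirected to a file) shows buffered output being lost when the process is SIGKILLed:

```shell
# Write a line, then idle without flushing; stdout goes to a file,
# so stdio block-buffers instead of line-buffering.
python3 -c 'print("hello"); import time; time.sleep(30)' > out.txt &
pid=$!

sleep 1             # give the process time to write (into its buffer)
kill -9 "$pid"      # SIGKILL: no chance to flush
wait "$pid" 2>/dev/null || true

cat out.txt         # empty: "hello" died in the buffer
```

Run the same command on a TTY and "hello" appears immediately, because stdio line-buffers terminal output.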

If you're on a GNU platform, you can use stdbuf to override this default buffering:

stdbuf -o0 ./ProgramA arg1 >output.txt

...or...

output=$(stdbuf -o0 ./ProgramA arg1)

Since you know that it flushes when output is to a TTY (output shows up immediately when run without redirection), you can also use unbuffer (a tool that ships with expect) to simulate that effect:

output=$(unbuffer ./ProgramA arg1)

However, the surest thing to do is to modify the source of ProgramA to explicitly perform a flush operation after every write you want to ensure is complete -- and to only use SIGKILL when absolutely required. (A common practice is to use a SIGTERM, wait for a substantial time period, and only then resort to a SIGKILL).
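That escalation pattern can be sketched in shell like this (a `sleep` stands in for ProgramA, and the 10-second grace period is an arbitrary choice):

```shell
# Placeholder long-running process standing in for ProgramA.
sleep 30 &
pid=$!

kill -TERM "$pid"                    # ask politely first

for _ in 1 2 3 4 5 6 7 8 9 10; do   # wait up to ~10 seconds
    kill -0 "$pid" 2>/dev/null || break    # kill -0: probe, don't signal
    sleep 1
done

if kill -0 "$pid" 2>/dev/null; then  # escalate only if still alive
    kill -9 "$pid"
fi
```

A process that handles SIGTERM gets the chance to flush and exit cleanly; SIGKILL is only the fallback.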

Upvotes: 1
