lockdoc

Reputation: 1689

Bash: Pipe stdout and catch stderr into a variable

Is it possible to keep piping normal stdout to another program, but store stderr in a variable?

This is the use case:

mysqldump database | gzip > database.sql

In this scenario I would like to capture all errors/warnings produced by mysqldump and store them in a variable, while the normal stdout (which is the dump) continues to be piped to gzip.

Any ideas about how to accomplish this?

Upvotes: 0

Views: 1636

Answers (2)

muru

Reputation: 4887

You could do something like:

errors=$(mysqldump database 2>&1 > >(gzip > database.sql))

Here, I'm using process substitution to feed mysqldump's stdout to gzip as its stdin. Because of the order of the redirections (2>&1 before >), mysqldump's stderr is what ends up being captured by the command substitution.

Testing it out:

$ a=$(sh -c 'echo foo >&2; echo bar' 2>&1 > >(gzip > foo))
$ gunzip < foo
bar
$ echo $a
foo
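
Applied back to the original question, a minimal sketch (assuming a database simply named "database"; adjust names and mysqldump options to your setup) could also check whether anything was captured:

errors=$(mysqldump database 2>&1 > >(gzip > database.sql))
if [ -n "$errors" ]; then
    # anything mysqldump wrote to stderr is now in $errors
    printf 'mysqldump reported:\n%s\n' "$errors" >&2
fi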

Upvotes: 2

assefamaru

Reputation: 2789

You can do the following:

mysqldump database 2> dump_errors | gzip > database.sql
error_var=$( cat dump_errors )
rm dump_errors

Here, all errors from mysqldump are redirected to a file called 'dump_errors', while stdout is piped to gzip, which in turn writes to database.sql.

The contents of 'dump_errors' are then assigned to the variable 'error_var', and the file 'dump_errors' is removed.
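
If you'd rather not risk clobbering an existing file, a minimal variation of the same idea (sketch only) uses mktemp for the scratch file:

err_file=$(mktemp)                                   # temporary file for stderr
mysqldump database 2> "$err_file" | gzip > database.sql
error_var=$(cat "$err_file")                         # capture the errors
rm -f "$err_file"                                    # clean up the scratch file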


Note the following redirections:

$ sort 1> output 2> errors   # stdout to "output", stderr to "errors"
$ sort 1> output 2>&1        # stdout to "output", then stderr to wherever stdout now points (also "output")
$ sort 2> errors 1>&2        # stderr to "errors", then stdout to wherever stderr now points (also "errors")

$ sort 2>&1 > output         # stderr to the current stdout (the terminal), then stdout to "output"
$ sort > output 2>&1         # stdout to "output", then stderr to the same place (also "output")
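
To see why the order matters, here is a small demonstration with a hypothetical helper function that writes one line to stdout and one to stderr:

emit() { echo out; echo err >&2; }

emit > log 2>&1   # both "out" and "err" end up in "log"
emit 2>&1 > log   # "err" goes to the terminal, only "out" ends up in "log"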

Upvotes: 3
