Bart B

Reputation: 669

Capturing the output of STDERR while piping STDOUT to a file

I have a rather odd situation. I'm trying to automate the backup of a collection of SVN repositories with Perl. I'm shelling out to the svnadmin dump command, which sends the dump to STDOUT, and any errors it encounters to STDERR.

The command I need to run will be of the form:

svnadmin dump $repo -q >$backupFile

STDOUT will go to the backup file, but STDERR is what I need to capture in my Perl script.

What's the right way to approach this kind of situation?

EDIT: To clarify: STDOUT will contain the SVN dump data; STDERR will contain any errors that may happen.

STDOUT needs to end up in a file, and STDERR needs to end up in Perl. At no point can ANYTHING but the original content of STDOUT end up in that stream, or the dump will be corrupted and I'll have something worse than no backup at all: a bad one!

Upvotes: 2

Views: 1728

Answers (5)

Joel Berger

Reputation: 20280

While tchrist is certainly correct that you can use handle redirection and backticks to make this work, I can also recommend David Golden's Capture::Tiny module. It gives generic interfaces for capturing or tee-ing STDOUT and STDERR; from there you can do with them what you will.
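For instance, here is a minimal sketch (assuming the `$repo` and `$backupFile` variables from the question) using the module's capture_stderr, which collects only STDERR while the shell sends STDOUT straight to the backup file, untouched:

use Capture::Tiny qw(capture_stderr);

# The shell redirects STDOUT to the file, so only the original dump
# data ever reaches it; anything written to STDERR lands in $errors.
my $errors = capture_stderr {
    system("svnadmin dump $repo -q > $backupFile");
};
warn "svnadmin reported: $errors" if length $errors;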

Upvotes: 3

tchrist

Reputation: 80384

This stuff is really easy. It’s what backticks were invented for, for goodness’ sake. Just do:

$his_error_output = `somecmd 2>&1 1>somefile`;

and voilà you’re done!

I don’t understand what the trouble is. Didn’t you have your gazzintas drilled into you as a young child the way Jethro did? :)
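Applied to the question's command, that looks like the sketch below. Note the order of the redirections: 2>&1 first points STDERR at the pipe the backticks are reading, and only then does 1> send STDOUT to the file.

# STDERR is captured by the backticks; STDOUT goes straight to the file.
my $errors = `svnadmin dump $repo -q 2>&1 1>$backupFile`;
die "svnadmin failed with status ", $? >> 8, ": $errors" if $? != 0;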

Upvotes: 2

Wes Hardaker

Reputation: 22252

Well, there are generic ways to do it within Perl too, but the bash solution (which the above makes me think you're looking for) is to redirect stderr to stdout first, and then redirect stdout to a file. Intuitively this doesn't make a whole lot of sense until you see what's happening internally: bash processes redirections left to right, so 2>&1 makes stderr a copy of wherever stdout currently points (your terminal, or your Perl pipe), and only then does > repoint stdout at the file. So this works:

svnadmin dump $repo -q 2>&1 >$backupFile

However, do not do it the other way around (i.e., put the 2>&1 at the end), or else the output of both stdout and stderr will go to your file.

Edit, to address some people's confusion about whether this works:

What you want is this:

# perl -e 'print STDERR "foo\n"; print "bar\n";' 2>&1 > /tmp/f 
foo
# cat /tmp/f
bar

and specifically you don't want this:

# perl -e 'print STDERR "foo\n"; print "bar\n";' > /tmp/f 2>&1
# cat /tmp/f
foo
bar

Upvotes: 7

andy

Reputation: 7018

Here's one way:

{
    local $/;   # slurp mode: read all of stderr as a single chunk

    open(my $cmd, '-|', "svnadmin dump $repo -q 2>&1 1>$backupFile")
        or die "can't run svnadmin: $!";
    $errinfo = <$cmd>;   # read the stderr from the above command
    close($cmd);
}

In other words, use the shell's 2>&1 mechanism to get stderr to a place where Perl can easily read it, and use 1> to get the dump sent to the file. The stuff I wrote about $/ and reading the stderr as a single chunk is just for convenience; you could read the stderr you get back any way you like, of course.
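One detail worth adding (a sketch, not part of the original answer): when the filehandle is a pipe, close returns false if the command exited nonzero, with the exit status left in $?, so the close above can cheaply detect a failed dump:

close($cmd)
    or warn $! ? "error closing pipe: $!"
               : "svnadmin exited with status " . ($? >> 8);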

Upvotes: 4

toolic

Reputation: 62002

From perldoc perlop for qx:

To read both a command's STDOUT and its STDERR separately, it's easiest to redirect them separately to files, and then read from those files when the program is done:

system("program args 1>program.stdout 2>program.stderr");
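Applied to the question, that strategy looks like this sketch ($errFile is a hypothetical path for the collected errors, not something from the original post):

# Hypothetical error-file path; any writable location works.
my $errFile = "$backupFile.err";

system("svnadmin dump $repo -q 1>$backupFile 2>$errFile") == 0
    or warn "svnadmin exited with status ", $? >> 8;

# Read the errors back once the program is done.
open my $fh, '<', $errFile or die "can't open $errFile: $!";
my $errors = do { local $/; <$fh> };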

Upvotes: 1
