Reputation: 1563
I have a shell script which copies a few files to the current directory, compresses them, and streams the compressed file to stdout.
On the client side I use plink to execute the script and capture the incoming stream to a file.
This almost works.
It seems that the cp command outputs the file name being copied when it's executed from inside the script. If I execute 'cp /path/to/file1 .' in the shell it runs quietly; if I execute it in a script it outputs "file1".
How do I prevent this? I've tried piping the output of the cp command to /dev/null and to a dummy text file but with no luck.
Thanks for any help.
#!/bin/bash
cp /path/to/file1 .
cp /path/to/file2 .
cp /path/to/file3 .
tar -cvzf package.tgz file1 file2 file3
cat package.tgz
file1
file2
file3
<<binary data>>
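A client-side invocation along these lines streams the remote script's stdout into a local file (the host and file names here are placeholders, not the real ones):
plink user@server /path/to/script.sh > received.tgz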
Upvotes: 0
Views: 1617
Reputation: 818
As others pointed out, the -v (verbose) option to tar is kicking out the file names to STDERR. You can also make your script more efficient by having tar write the compressed file stream to STDOUT:
tar zcf - file1 file2 file3
In this example, passing "-" as the archive filename makes tar write the archive to STDOUT.
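Putting that together, the whole script could shrink to a single command. This is only a sketch, assuming the three files all live in /path/to as in the question; -C switches into that directory so the archive stores bare file names:
#!/bin/bash
# Stream the compressed archive straight to stdout: no local copies,
# no intermediate package.tgz, and no -v, so only archive bytes are written.
tar czf - -C /path/to file1 file2 file3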
Upvotes: 0
Reputation: 9574
Aha! I'd always assumed that the file names emitted by tar go to stderr, but that isn't always the case: only if you write your tar file to stdout do the files written by -v go to stderr:
$ tar cvf - share > /dev/null
share/ # this must be going
share/.DS_Store # to stderr since we
share/man/ # redirected stdout to
share/man/.DS_Store # /dev/null above.
share/man/man1/
share/man/man1/diffmerge.man1
The counter-example:
$ tar cvf blah.tar share > /dev/null
This produced no list of file names, because this time they went to stdout, which was redirected to /dev/null.
I guess you learn something new every day. :-)
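It also explains why streaming the archive to stdout, as suggested above, is safe even with -v left in. A quick sketch of the same idea (GNU tar assumed):
tar cvzf - file1 file2 file3 > package.tgz
# file names appear on the terminal (stderr); the archive bytes land in package.tgz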
Upvotes: 2
Reputation: 45545
It's not cp, it's tar. You are passing it -v, which makes it print the names of the files.
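Applied to the script in the question, that just means dropping the v from the flags:
tar -czf package.tgz file1 file2 file3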
Upvotes: 19