sfactor

Reputation: 13062

uncompressing a large number of files on the fly

I have a script that I need to run on a large number of files with the extension **.tar.gz**.

Instead of uncompressing them and then running the script, I want to be able to uncompress them as I run the command and then work on the uncompressed folder, all with a single command.

I think a pipe is a good solution for this, but I haven't used one before. How would I do this?

Upvotes: 1

Views: 471

Answers (3)

aularon

Reputation: 11110

The -v flag tells tar to print each filename as it is extracted:

tar -xzvf file.tar.gz | xargs -I {} -d\\n myscript "{}"

This way your script only needs to deal with a single file at a time; xargs passes each extracted filename as a parameter to your script ($1 inside the script).

Edit: the -I {} -d\\n part makes it work with filenames that contain spaces.
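
For illustration, myscript might look something like this (a minimal sketch; the name myscript and the per-file action are placeholders for whatever your real script does):

    #!/bin/bash
    # Receives one extracted path per invocation, courtesy of xargs -I {}
    file="$1"
    [ -f "$file" ] || exit 0    # tar -v also lists directories; skip those entries
    echo "processing $file"     # replace with your actual per-file commands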

Upvotes: 1

johnsyweb

Reputation: 141780

The following three lines of bash...

for archive in *.tar.gz; do
    tar zxvf "${archive}" 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u | xargs some_script.sh
done

...will iterate over each gzipped tarball in the current directory, decompress it, grab the top-most directories of the decompressed contents and pass those as arguments to some_script.sh. This probably uses more pipes than you were expecting but seems to do what you are asking for.

N.B.: tar xf can extract only one archive per invocation.
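
To see what the sed expression is doing, here is a sketch with a hypothetical archive, assuming bsdtar-style verbose output (as on OS X), where each entry is printed to stderr with an "x " prefix:

    $ tar zxvf example.tar.gz 2>&1
    x mydir/
    x mydir/a.txt
    x mydir/sub/b.txt
    $ tar zxvf example.tar.gz 2>&1 | sed -e 's!x \([^/]*\)/.*!\1!' | sort -u
    mydir

Each listing line is reduced to its leading directory component, and sort -u collapses the repeats, so some_script.sh is called once with each top-level directory name.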

Upvotes: 1

strager

Reputation: 90012

You can use a for loop:

for file in *.tar.gz; do tar -xf "$file"; your commands here; done

Or expanded:

for file in *.tar.gz; do
    tar -xf "$file"
    # your commands here
done
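
If each tarball unpacks into a folder named after the archive (a common convention, but an assumption about your files), you can derive that folder name with a parameter expansion and point your commands at the uncompressed folder (using some_script.sh as a stand-in for your command):

    for file in *.tar.gz; do
        tar -xf "$file"
        dir="${file%.tar.gz}"     # strip the extension to get the folder name
        some_script.sh "$dir"     # work on the uncompressed folder
    done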

Upvotes: 0
