Tonio

Reputation: 1736

GZip every file separately

How can we GZip every file separately?

I don't want to have all of the files in a big tar.

Upvotes: 64

Views: 53856

Answers (8)

Vadiraj k.s

Reputation: 59

FYI: this will overwrite any existing .gz files, and create new ones where none exist:

find . -type f | grep "in case any specific" | grep -E -v "\.gz$" | xargs -n1 -P8 sh -c 'gzip --force --best "$0"'
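A cleaner, whitespace-safe sketch of the same idea (file names here are made up for the demo; -P8 runs up to eight gzip processes in parallel):

```shell
# Demo in a scratch directory; -print0 / -0 handle names with spaces,
# and --force overwrites any pre-existing .gz files.
dir=$(mktemp -d); cd "$dir"
printf '1\n' > one.txt
printf '2\n' > two.txt
find . -type f ! -name '*.gz' -print0 | xargs -0 -n1 -P8 gzip --force --best
ls
```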

Upvotes: 0

Marinos An

Reputation: 10900

The following command can be run multiple times inside a directory (without "already has .gz suffix" warnings) to gzip whatever is not already gzipped:

find . -maxdepth 1 -type f ! -name '*.gz' -exec gzip "{}" \;

A more useful application of find is gzipping rolling logs: e.g. a daily or monthly job that compresses rolled logs but leaves the current log untouched.

# Considering that current logs end in .log and 
# rolled logs end in .log.[yyyy-mm-dd] or .log.[number]
find . -maxdepth 1 -type f ! -name '*.gz' ! -name '*.log' -exec gzip "{}" \;
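A runnable sketch of the rolling-log case, in a scratch directory with made-up file names:

```shell
# app.log is the current log; the other two are rolled logs.
dir=$(mktemp -d); cd "$dir"
printf 'current\n' > app.log
printf 'rolled\n'  > app.log.2024-01-01
printf 'older\n'   > app.log.1
# Compress everything except .gz files and the current .log file.
find . -maxdepth 1 -type f ! -name '*.gz' ! -name '*.log' -exec gzip "{}" \;
ls
```

After the run, app.log is left alone and the rolled logs have become .gz files; running it again is a no-op.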

Upvotes: 2

itsmisterbrown

Reputation: 531

Or, if you have pigz (a gzip utility that parallelizes compression across multiple processors and cores):

pigz *

Upvotes: 8

Mark Setchell

Reputation: 208107

Easy and very fast answer that will use all your CPU cores in parallel:

parallel gzip ::: *

GNU Parallel is a fantastic tool that should be used far more in a world where CPUs are getting more cores rather than more speed. Its documentation has loads of examples that we would all do well to take 10 minutes to read.

Upvotes: 57

Courtney Faulkner

Reputation: 2080

You can use gzip *


Note:

  • This will zip each file individually and DELETE the original.
  • Use -k (--keep) option to keep the original files.
  • This may not work if you have a huge number of files due to limits of the shell
  • To run gzip in parallel see @MarkSetchell's answer below.
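A quick sketch of the -k behavior (requires gzip 1.6 or later; file names are made up):

```shell
# gzip -k compresses each file individually and keeps the original.
dir=$(mktemp -d); cd "$dir"
printf 'hello\n' > a.txt
printf 'world\n' > b.txt
gzip -k a.txt b.txt
ls   # both the .txt originals and the new .gz copies are present
```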

Upvotes: 106

leekaiinthesky

Reputation: 5603

After seven years, this highly upvoted comment still doesn't have its own full-fledged answer, so I'm promoting it now:

gzip -r .

This has two advantages over the currently accepted answer: it works recursively if there are any subdirectories, and it won't fail from Argument list too long if the number of files is very large.
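A sketch of the recursive behavior in a scratch directory (names are made up): each file is compressed individually, in place, including files in subdirectories.

```shell
dir=$(mktemp -d); cd "$dir"
mkdir -p sub
printf 'top\n'  > top.txt
printf 'deep\n' > sub/deep.txt
gzip -r .
find . -name '*.gz'   # both files, each gzipped separately
```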

Upvotes: 45

Buddy

Reputation: 6713

If you want to gzip every file recursively, you could use find piped to xargs:

$ find . -type f -print0 | xargs -0r gzip

Upvotes: 14

Federico Giorgi

Reputation: 10765

Try a loop

$ for file in *; do gzip "$file"; done
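One caveat with the bare loop: if the directory already contains .gz files, gzip will complain about them on a second run. A guarded variant (file names are made up for the demo):

```shell
dir=$(mktemp -d); cd "$dir"
printf 'x\n' > x.txt
printf 'y\n' > y.gz   # name already ends in .gz, so the guard skips it
for file in *; do
  case "$file" in
    *.gz) ;;          # already compressed (by name), skip
    *) gzip "$file" ;;
  esac
done
ls
```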

Upvotes: 8
