flybywire

Reputation: 274120

tradeoffs of different compression algorithms

What are the tradeoffs of the different compression algorithms?

The purpose is backup, transfer & restore. I don't care about popularity, as long as a mature enough tool exists for Unix. I care about:

The algorithms I am considering are:

Upvotes: 1

Views: 571

Answers (4)

paxdiablo

Reputation: 882756

It usually depends on your input data, but I've never found anything that gives me better general compression than 7zip (http://www.7-zip.org).

Upvotes: 1

bill

Reputation: 1361

The best way is to look at compression benchmark sites:

Maximumcompression

Compressionratings

Upvotes: 2

pauljwilliams

Reputation: 19225

It would be straightforward to create a simple testbed for this.

Write a script that runs each tool in turn on a set of files representative of those you wish to compress, and measures the time, CPU usage, memory usage, and compression ratio achieved.

Rerun them a statistically significant number of times, and you'll have your answer.
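As a starting point, here is a minimal sketch of such a testbed in Python. It assumes gzip, bzip2, and xz are installed and that sample.tar is a placeholder for an archive representative of your data; adjust the tool names and flags to whatever you are actually comparing.

```python
#!/usr/bin/env python3
"""Rough benchmark sketch: run each compressor on a sample file and
report wall-clock time and compression ratio. Tool names, flags, and
the sample path are assumptions -- adapt them to your environment."""

import os
import subprocess
import time

SAMPLE = "sample.tar"  # hypothetical archive representative of your data

# -k keeps the original input, -f overwrites old output (recent gzip/bzip2/xz)
TOOLS = {
    "gzip":  (["gzip", "-k", "-f", SAMPLE],  SAMPLE + ".gz"),
    "bzip2": (["bzip2", "-k", "-f", SAMPLE], SAMPLE + ".bz2"),
    "xz":    (["xz", "-k", "-f", SAMPLE],    SAMPLE + ".xz"),
}

original_size = os.path.getsize(SAMPLE)

for name, (cmd, output) in TOOLS.items():
    start = time.perf_counter()
    subprocess.run(cmd, check=True)
    elapsed = time.perf_counter() - start
    ratio = os.path.getsize(output) / original_size
    print(f"{name:6s}  {elapsed:7.2f}s  ratio {ratio:.3f}")
```

Wrap the loop body in a few repetitions and average the timings to get the statistically meaningful numbers mentioned above.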

Upvotes: 0

glmxndr

Reputation: 46616

Tar is not a compression algorithm per se.

You may use zip/gzip when compression and decompression time is the most important concern.

You may use bzip2 when you need a better compression ratio.

You may use LZMA when you need an even higher compression ratio, at the cost of more CPU time.

Have a look here.
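To get a quick feel for that tradeoff yourself, here is a small sketch using Python's standard-library bindings for the three families (gzip/zlib, bzip2, LZMA); the input file name is just a placeholder for your own data:

```python
#!/usr/bin/env python3
"""Illustrate the ratio vs. CPU-time tradeoff by compressing the same
data with gzip, bzip2, and LZMA via Python's standard library."""

import bz2
import gzip
import lzma
import time

with open("backup.tar", "rb") as f:  # hypothetical input file
    data = f.read()

for name, compress in [("gzip", gzip.compress),
                       ("bzip2", bz2.compress),
                       ("lzma", lzma.compress)]:
    start = time.perf_counter()
    compressed = compress(data)
    elapsed = time.perf_counter() - start
    print(f"{name:6s}  {len(compressed) / len(data):.3f} of original "
          f"in {elapsed:.2f}s")
```

Typically you will see gzip finish fastest with the largest output, and LZMA produce the smallest output while taking the most CPU time, with bzip2 in between.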

Upvotes: 2
