chiz

Reputation: 83

I need to choose a compression algorithm

I need to choose a compression algorithm to compress some data. I won't know the type of data I'll be compressing in advance (think of it as something like the WinRAR program).

I've heard of a few algorithms, but I don't know which one I should use. Can anyone post a short list of pros and cons? For my application the first priority is decompression speed; the second priority is space saved. Compression (not decompression) speed is irrelevant.

Upvotes: 8

Views: 5369

Answers (6)

P Varga

Reputation: 20269

One of the fastest compression algorithms these days is LZ4, reportedly reaching RAM speed limits during decompression.
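
As a rough illustration, the third-party lz4 Python bindings (pip install lz4) expose the frame format in a couple of calls; the payload below is just a stand-in:

    import lz4.frame

    data = b"example payload with some repetition " * 10_000

    compressed = lz4.frame.compress(data)
    assert lz4.frame.decompress(compressed) == data
    print(f"ratio: {len(compressed) / len(data):.2%}")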

On the other hand, an algorithm generally providing the best compression ratios is LZMA2, used by xz and 7z. It comes with caveats, though, the main one being that compression (as opposed to decompression) is very slow.

A good balance is provided by Zstandard, which is fast but can also provide ratios competitive with LZMA.
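
A minimal sketch of that trade-off, assuming Python's stdlib lzma module and the third-party zstandard package (pip install zstandard); the levels are illustrative, not tuned recommendations:

    import lzma
    import zstandard

    data = b"moderately repetitive payload " * 10_000

    xz = lzma.compress(data, preset=9)         # LZMA2: best ratio, slow to compress

    cctx = zstandard.ZstdCompressor(level=19)  # high level narrows the gap to LZMA
    zst = cctx.compress(data)
    assert zstandard.ZstdDecompressor().decompress(zst) == data

    print(f"xz:  {len(xz)} bytes")
    print(f"zst: {len(zst)} bytes")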

Another popular option nowadays is Brotli, which is focused more on speed than on achieving the highest compression ratio. Support for both the Zstd and Brotli Content-Encodings has recently been added to the HTTP protocol.
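
For completeness, a similar sketch with the third-party brotli package (pip install brotli), the same codec browsers negotiate via Content-Encoding: br; quality runs 0-11, lower being faster:

    import brotli

    data = b"<html><body>hello</body></html>" * 1_000

    compressed = brotli.compress(data, quality=5)
    assert brotli.decompress(compressed) == data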

The winner in benchmarks used to be PAQ; however, it isn't widely used and I couldn't find an actively maintained implementation of it.

Upvotes: 3

Andreas Bonini

Reputation: 44832

I ran a few benchmarks compressing a .tar that contained a mix of high entropy data and text. These are the results:

Name  - Compression rate* - Decompression Time
7zip  - 87.8%             - 0.703s
bzip2 - 80.3%             - 1.661s
gzip  - 72.9%             - 0.347s
lzo   - 70.0%             - 0.111s

*Space saved relative to the original; higher is better

From this I came to the conclusion that the compression rate of an algorithm depends on its name; the first in alphabetical order will be the one with the best compression rate, and so on.

Therefore I decided to rename lzo to 1lzo. Now I have the best algorithm ever.
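
(Joking aside, this kind of measurement is easy to reproduce with the codecs in Python's standard library; lzo and 7zip need third-party packages, so they're omitted here, and sample.tar is a placeholder for whatever you want to test. Single-run timings are noisy, so a real benchmark should repeat and average.)

    import bz2, gzip, lzma, time

    data = open("sample.tar", "rb").read()

    for name, compress, decompress in [
        ("xz (LZMA2)", lambda d: lzma.compress(d, preset=9), lzma.decompress),
        ("bzip2", bz2.compress, bz2.decompress),
        ("gzip", gzip.compress, gzip.decompress),
    ]:
        blob = compress(data)
        start = time.perf_counter()
        decompress(blob)
        elapsed = time.perf_counter() - start
        print(f"{name:10}  saved: {1 - len(blob) / len(data):6.1%}  decompression: {elapsed:.3f}s")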


EDIT: it's worth noting that, of all of them, lzo is unfortunately the only one with a very restrictive license (GPL) :(

Upvotes: 15

Maja Piechotka

Reputation: 7216

It is well explained in the Linux kernel documentation (for the algorithms included in the kernel):

  • Deflate (gzip) - Fast, worst compression
  • bzip2 - Slow, middle compression
  • lzma - Very slow compression, fast decompression (though slower than gzip's), best compression

I haven't used the others, so it's hard to say, but the speed of an algorithm can depend heavily on the architecture. For example, there are studies showing that compressing data stored on an HDD speeds up I/O, since the processor is so much faster than the disk that the extra work pays off. However, it depends largely on where the bottleneck is.

Similarly, one algorithm may use memory extensively, which may or may not be a problem (is 12 MiB a lot or very little? On embedded systems it is a lot; on a modern x86 it is a tiny fraction of memory).
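
To make the memory point concrete: with LZMA2 the dictionary size dominates memory use on both ends, and (assuming Python's stdlib lzma module) a custom filter chain lets you cap it for constrained targets; the 1 MiB figure here is only illustrative:

    import lzma

    # Preset 9 normally implies a 64 MiB dictionary; override it to 1 MiB.
    filters = [{"id": lzma.FILTER_LZMA2, "preset": 9, "dict_size": 1 << 20}]

    data = b"sensor reading 42\n" * 50_000
    compressed = lzma.compress(data, format=lzma.FORMAT_XZ, filters=filters)
    assert lzma.decompress(compressed) == data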

Upvotes: 5

Andras Vass

Reputation: 11638

For a comprehensive benchmark on text data you might want to check out the Large Text Compression Benchmark.

For other types, this might be indicative.

Upvotes: 1

37Stars

Reputation: 2499

Take a look at 7zip. It's open source and includes seven separate compression methods. In some quick testing we did, the 7z format produced a much smaller result file than zip, and it was also faster for the sample data we used.

Since our standard compression is zip, we haven't looked at other compression methods yet.
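
A stdlib-only way to see roughly why 7z beats zip is to compare zip's deflate codec against the LZMA family that 7z uses by default (this sketch compares the raw codecs, not the container formats themselves; sample.bin is a placeholder):

    import lzma
    import zlib

    data = open("sample.bin", "rb").read()

    deflated = zlib.compress(data, 9)         # what classic zip uses
    lzma_out = lzma.compress(data, preset=9)  # the codec family behind 7z/xz

    print(f"deflate: {len(deflated)} bytes")
    print(f"lzma:    {len(lzma_out)} bytes")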

Upvotes: 2

Ignacio Vazquez-Abrams

Reputation: 799560

If you need high decompression speed then you should be using LZO. Its compression speed and ratio are decent, but it's hard to beat its decompression speed.
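
A minimal round-trip sketch, assuming the third-party python-lzo bindings (pip install python-lzo), whose module-level helpers store the uncompressed length in a small header so a plain decompress() works:

    import lzo

    data = b"log line with mild redundancy\n" * 20_000

    compressed = lzo.compress(data)
    assert lzo.decompress(compressed) == data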

Upvotes: 5
