Cloud Wolf

Reputation: 33

Compression results from EZW paper not reproducible?

I have been doing some research into the Embedded Zerotrees of Wavelet transforms (EZW) algorithm as a reference method for a journal paper. I found a couple of implementations in MATLAB, Python, and C and was able to run a few of them on the grayscale Lena and Mandrill test images. The PSNR vs. compression ratio (CR) curves varied between implementations, and even the best of them fell short of the paper's results by a compression-ratio factor of roughly 5x. I then implemented it myself in MATLAB, following the paper and the Wikipedia example to a T. My version is a small improvement, but it still uses drastically more bytes per dB of PSNR than Shapiro's 1993 paper reports. I looked for other papers that reference it, but the ones I found simply copy Shapiro's numbers rather than re-running the test images. The only deliberate difference in my implementation is the entropy coder: I use Huffman coding rather than the paper's adaptive arithmetic coding. I will attach my qualitative and quantitative results here; any suggestions would be appreciated.
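For reference, these are the metric definitions I am using for the curves, sketched in Python/NumPy. The 8-bit peak value of 255 and the convention of counting the entire coded bitstream (including headers) as the compressed size are my assumptions, and could be a source of mismatch with the paper:

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB; peak=255 assumes 8-bit images."""
    diff = original.astype(np.float64) - reconstructed.astype(np.float64)
    mse = np.mean(diff ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

def compression_ratio(raw_bytes, coded_bytes):
    """Uncompressed size divided by total coded bitstream size."""
    return raw_bytes / coded_bytes

# Example: a 512x512 8-bit image (one byte per pixel) coded to 16384 bytes
raw = 512 * 512
cr = compression_ratio(raw, 16384)  # -> 16.0 (i.e. 0.5 bpp)
```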

Paper: https://www.di.univr.it/documenti/OccorrenzaIns/matdid/matdid862310.pdf

Wiki: en.wikipedia.org/wiki/Embedded_zerotrees_of_wavelet_transforms

For these tests, I ran EZW with 5 levels of wavelet decomposition and Huffman encoding. I also tested MATLAB's wcompress function with the 'ezw' method and got better results, but still worse than the paper's.
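In case it helps pinpoint where my implementation diverges, here is the dominant-pass symbol classification as I understand it, sketched in Python/NumPy rather than my MATLAB code. The quadtree parent-child map (children of (r, c) at (2r+dr, 2c+dc), with the DC coefficient parenting the three subband roots) follows my reading of the Wikipedia example; the helper names and the toy coefficient array are my own:

```python
import numpy as np

def children(r, c, n):
    """Quadtree children of coefficient (r, c) in an n x n pyramid layout."""
    if r == 0 and c == 0:
        return [(0, 1), (1, 0), (1, 1)]  # DC parents the three subband roots
    if 2 * r >= n or 2 * c >= n:
        return []                        # finest band: no children
    return [(2 * r + dr, 2 * c + dc) for dr in (0, 1) for dc in (0, 1)]

def descendants_significant(coeffs, r, c, threshold):
    """True if any descendant of (r, c) has magnitude >= threshold."""
    n = coeffs.shape[0]
    for rr, cc in children(r, c, n):
        if abs(coeffs[rr, cc]) >= threshold:
            return True
        if descendants_significant(coeffs, rr, cc, threshold):
            return True
    return False

def dominant_symbol(coeffs, r, c, threshold):
    """EZW dominant-pass symbol for one coefficient: POS/NEG/IZ/ZTR."""
    v = coeffs[r, c]
    if abs(v) >= threshold:
        return 'POS' if v >= 0 else 'NEG'
    if descendants_significant(coeffs, r, c, threshold):
        return 'IZ'   # insignificant, but some descendant is significant
    return 'ZTR'      # zerotree root: entire subtree insignificant

# Toy 4x4 coefficient array, first threshold 32
coeffs = np.array([[ 63, -34,  49,  10],
                   [-31,  23,  14, -13],
                   [ 15,  14,  47, -12],
                   [ -9,  -7, -14,   8]])
print(dominant_symbol(coeffs, 0, 0, 32))  # -> POS
print(dominant_symbol(coeffs, 1, 1, 32))  # -> IZ  (descendant 47 is significant)
print(dominant_symbol(coeffs, 1, 0, 32))  # -> ZTR
```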

[attached images: qualitative and quantitative results]

Upvotes: 1

Views: 74

Answers (0)
