Reputation: 2551
The Android API provides the Bitmap.compress(format, quality, output)
method for saving bitmap objects. I created a sample app which loads a JPEG image (a noisy camera photo) into a bitmap, compresses it back to the same file, and then repeats this five times.
Obviously, my bitmap accumulates compression artifacts. Surprisingly for me, the amount of these artifacts depends on the compression quality in a weird way. When I set the quality to 100 (which I expected to be the best quality), the artifacts are distinctly visible. When I drop the quality to 90, the artifacts are significantly less visible. A quality setting of 80 gives me the best results. With a quality setting of 70 and below, the image degrades quickly.
When I compress bitmaps at quality 100, the resulting file's size increases linearly on every pass. For quality settings of 90 and 80, the size of the resulting file stays about the same on every pass.
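The experiment above can be reproduced off-device. Bitmap.compress only exists on Android, so this sketch uses javax.imageio on a plain JVM instead, with a synthetic noisy image standing in for the camera photo; it is an illustration of the re-encode loop, not the questioner's actual app.

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Random;
import javax.imageio.IIOImage;
import javax.imageio.ImageIO;
import javax.imageio.ImageWriteParam;
import javax.imageio.ImageWriter;

public class RecompressDemo {

    // Encode a BufferedImage as JPEG at the given quality (0.0-1.0).
    static byte[] toJpeg(BufferedImage img, float quality) throws Exception {
        ImageWriter writer = ImageIO.getImageWritersByFormatName("jpg").next();
        ImageWriteParam param = writer.getDefaultWriteParam();
        param.setCompressionMode(ImageWriteParam.MODE_EXPLICIT);
        param.setCompressionQuality(quality);
        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        writer.setOutput(ImageIO.createImageOutputStream(baos));
        writer.write(null, new IIOImage(img, null, null), param);
        writer.dispose();
        return baos.toByteArray();
    }

    // Decode and re-encode `passes` times, recording the file size after each pass,
    // mirroring the question's "compress back to the same file" loop.
    static int[] recompress(BufferedImage img, float quality, int passes) throws Exception {
        int[] sizes = new int[passes];
        BufferedImage current = img;
        for (int i = 0; i < passes; i++) {
            byte[] jpeg = toJpeg(current, quality);
            sizes[i] = jpeg.length;
            current = ImageIO.read(new ByteArrayInputStream(jpeg));
        }
        return sizes;
    }

    public static void main(String[] args) throws Exception {
        // Synthetic "noisy camera photo": random RGB pixels.
        BufferedImage noisy = new BufferedImage(128, 128, BufferedImage.TYPE_INT_RGB);
        Random rnd = new Random(42);
        for (int y = 0; y < 128; y++)
            for (int x = 0; x < 128; x++)
                noisy.setRGB(x, y, rnd.nextInt(0x1000000));

        System.out.println("q=80  sizes: " + java.util.Arrays.toString(recompress(noisy, 0.80f, 5)));
        System.out.println("q=100 sizes: " + java.util.Arrays.toString(recompress(noisy, 1.00f, 5)));
    }
}
```

Comparing the two printed size sequences shows how each quality setting behaves over successive generations; exact numbers depend on the JVM's JPEG encoder, so they won't match the Android devices byte for byte.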
I've tested this behavior on an Android 5 device (HTC One) and an Android 6 device (Motorola Moto G), and it was quite consistent. On an Android 7 device (Samsung S7), though, I could not spot any difference in the resulting images.
So, my question is: why does compressing with quality = 80 give better results than quality = 90, and especially quality = 100? I really expected saving images at quality 100 to be almost lossless (like it is in, say, GIMP).
Upvotes: 0
Views: 546
Reputation: 5042
It's hard to tell without seeing examples, but I'm assuming the artifacts you're noticing are high-frequency components (characterized by sharp, steep changes in brightness that only last for a pixel or two). Although not mandatory, many JPEG encoders use quantization matrices that attenuate high-frequency components more at lower quality settings, thus making room for the lower-frequency stuff that you might consider more "fundamental" to the image.
So, it's not hard to imagine that, at lower quality settings, high-frequency components would be "cut out" of the image, lowering the gradients and producing an overall "smoother" look. And, it's also not hard to imagine that, with high (but imperfect) quality, some high-frequency components could be exaggerated, and even amplified (due to quantization errors), over successive runs.
Put another way, JPEG's 8x8 blocks are made up of linear combinations of the following 8x8 primitives (the DCT basis functions), and at lower quality settings the primitives closer to the right and the bottom are less likely to be present, so: no sharp edges.
(Original image from Wikimedia, see here)
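The quantize-then-round step that causes this can be shown numerically. This is a toy sketch with made-up step sizes, not the actual quantization tables from any JPEG encoder:

```java
public class QuantDemo {

    // Quantize one DCT coefficient: divide by the quantization step, round to the
    // nearest integer, then multiply back. The rounding is where JPEG loses data.
    static int quantizeDequantize(int coeff, int step) {
        return Math.round((float) coeff / step) * step;
    }

    public static void main(String[] args) {
        int highFreqCoeff = 13; // a small high-frequency DCT coefficient

        // Fine step (high quality): the coefficient survives, with a small error.
        System.out.println(quantizeDequantize(highFreqCoeff, 2));  // prints 14

        // Coarse step (low quality): the coefficient is rounded away entirely,
        // so the corresponding sharp-edged primitive simply drops out of the block.
        System.out.println(quantizeDequantize(highFreqCoeff, 40)); // prints 0
    }
}
```

Note that the "survives" case still changed 13 into 14: that small residual error, re-introduced on every decode/encode pass, is what can accumulate over successive runs even at high quality.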
Upvotes: 1
Reputation: 16409
JPEG is a lossy compression algorithm. Even if you use 100% quality, the decoded image will not be identical to the original. So if you encode and decode the same image multiple times, it is normal for artifacts to accumulate.
Use the PNG format if you don't want any losses.
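PNG's losslessness is easy to verify with a round trip. A sketch using javax.imageio on the desktop (on Android the equivalent would be compressing with Bitmap.CompressFormat.PNG, where the quality parameter is ignored):

```java
import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.util.Random;
import javax.imageio.ImageIO;

public class PngRoundTrip {

    // Encode to PNG, decode again, and report whether every pixel survived exactly.
    static boolean roundTripIsLossless(BufferedImage img) throws Exception {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        ImageIO.write(img, "png", out);
        BufferedImage back = ImageIO.read(new ByteArrayInputStream(out.toByteArray()));
        for (int y = 0; y < img.getHeight(); y++)
            for (int x = 0; x < img.getWidth(); x++)
                if (img.getRGB(x, y) != back.getRGB(x, y))
                    return false;
        return true;
    }

    public static void main(String[] args) throws Exception {
        // Random noise is the worst case for compression, yet PNG keeps every pixel.
        BufferedImage noisy = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Random rnd = new Random(7);
        for (int y = 0; y < 64; y++)
            for (int x = 0; x < 64; x++)
                noisy.setRGB(x, y, rnd.nextInt(0x1000000));
        System.out.println(roundTripIsLossless(noisy)); // prints "true"
    }
}
```

Running the same pixel comparison after a JPEG round trip (at any quality) would return false, which is the whole difference between the two formats.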
My answer is admittedly incomplete, because 80% should not give better quality than 100%; if that happened in your case, I don't know why.
The loss isn't observable when encoding with Photoshop or GIMP, probably because they use better encoders. The better results on Android 7 compared to the lower versions might have the same cause.
Upvotes: 0