Iceandele

Reputation: 660

Number of Bits per image?

So this was a question on our exam, and our study group is arguing about what is the correct answer. It's worded something like this.

You have a 100x100 picture; each pixel has 512 different levels for r, g, and b, and can be either opaque or transparent. How many bits (0s and 1s) are needed to represent this picture?

My answer to this was 280000, and my thought process was the following:

512 different levels per color means each channel can be 0-511, or in binary 111111111, which is 9 bits.

Therefore, to represent (r g b) it would be (111111111 111111111 111111111), or 27 bits.

To represent either opaque or transparent would take another bit (111111111 111111111 111111111 1), making 28 bits total for a single pixel.

Now, since there are 100x100 = 10,000 pixels, I'd need to multiply that by 28 bits per pixel, giving me 280,000.
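The arithmetic above can be sanity-checked with a short Python snippet (this is just an illustration of the reasoning, not part of the original question):

```python
import math

width, height = 100, 100
levels_per_channel = 512

# bits needed for one channel: ceil(log2(512)) = 9
bits_per_channel = math.ceil(math.log2(levels_per_channel))

# three colour channels plus one opaque/transparent bit
bits_per_pixel = 3 * bits_per_channel + 1  # 27 + 1 = 28

total_bits = width * height * bits_per_pixel
print(bits_per_channel, bits_per_pixel, total_bits)  # 9 28 280000
```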

Am I correct? Or is there another answer for this?

Upvotes: 1

Views: 1329

Answers (1)

sleske

Reputation: 83577

Yes, that is correct (100 x 100 x 28 = 280,000) - at least if you want to do it with the minimum number of bits.

In practice, the image size and number of bits per pixel would probably be constrained by what the hardware supports best. For example, that's why RGB data with 8 bits per R/G/B channel is usually stored in 32 bits, even though you only need 24 - most hardware can work with 32-bit words more efficiently than with 24-bit ones.
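To illustrate that padding, here is a small sketch comparing the minimum storage for 8-bit RGB against word-aligned 32-bit storage (the 100x100 size is reused from the question; the 8-bit and 32-bit figures are the ones mentioned in the answer):

```python
width, height = 100, 100

# 8 bits per channel, 3 channels = 24 bits of actual data per pixel,
# but each pixel is commonly padded to a 32-bit word for aligned access
data_bits_per_pixel = 3 * 8    # 24
stored_bits_per_pixel = 32     # padded to a word boundary

minimum_bits = width * height * data_bits_per_pixel   # 240000
stored_bits = width * height * stored_bits_per_pixel  # 320000
print(minimum_bits, stored_bits)  # 240000 320000
```

The extra 8 bits per pixel are wasted space, traded for faster word-aligned memory access (or sometimes used for an alpha channel).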

Upvotes: 1
