tyanne

Reputation: 1

Why is my byte array size smaller than the image size?

I am trying to convert an image (specifically Spotify album covers, which are a constant 640x640) to a byte array that will then be used to display the image on a 32x32 RGB matrix. This is the code:

import urllib.request

url = "https://i.scdn.co/image/ab67616d0000b2734c63ce517dd44d0fbd9416da"
path = "test.jpg"


def image_byte(url, file_path):
    # Download the image to file_path, then read it back as raw bytes
    urllib.request.urlretrieve(url, file_path)
    with open(file_path, "rb") as image:
        f = image.read()
        b = bytearray(f)
    return b


b = image_byte(url, path)
print(len(b))

The length of the byte array comes out to be 175742. Why? Shouldn't the byte array size be 640 x 640 x 3 = 1228800?

Sorry if I am missing something huge; this is my first project with an RGB matrix and anything to do with image conversion. Thanks in advance.

Upvotes: 0

Views: 488

Answers (1)

ascpixi

Reputation: 587

This is because the image is compressed: it is in the JPEG format. If the returned data were the raw pixel values, your calculation would be correct; however, pretty much every service returns image data in a compressed form.
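As a quick sanity check, you can look at the first few bytes of the file you downloaded: every JPEG file starts with the marker bytes FF D8 FF. A minimal sketch, assuming the file from your question is saved as "test.jpg":

# Read the first three bytes of the downloaded file ("test.jpg" from the question)
with open("test.jpg", "rb") as f:
    header = f.read(3)

print(header.hex())               # prints "ffd8ff" for JPEG data
print(header == b"\xff\xd8\xff")  # True confirms the file is compressed JPEG, not raw pixels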

You can decode the compressed image with a library like Pillow:

from PIL import Image

with open("example_image.jfif", "rb") as img_data:  # open the image in binary mode
    img = Image.open(img_data)
    actual_data = list(img.getdata())  # actual_data will hold all pixel values
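
Since your end goal is a 32x32 RGB matrix, here is a minimal follow-up sketch for getting raw, uncompressed RGB bytes at the matrix resolution. It assumes the downloaded file is at "test.jpg" and that your matrix library accepts packed RGB bytes; adjust the resampling and byte order to whatever your hardware expects.

from PIL import Image

# Decode the JPEG, force a plain 3-bytes-per-pixel RGB representation,
# and scale it down to the 32x32 matrix resolution.
img = Image.open("test.jpg").convert("RGB")
small = img.resize((32, 32))

raw = small.tobytes()            # raw pixel data for the matrix
print(len(raw))                  # 32 * 32 * 3 = 3072 bytes

# For the full-size image, the uncompressed data matches your original estimate
# (assuming the cover really is 640x640):
print(len(img.tobytes()))        # 640 * 640 * 3 = 1228800 bytes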

Upvotes: 3
