ROBOTPWNS

Reputation: 4419

What do pixel values in an image array mean?

An image in matplotlib is stored as a numpy array, but I am not exactly sure what the pixel values inside that array mean. I understand that an RGB color is represented by an 8-bit value per channel, so for a colored image shouldn't each pixel be a vector of three 8-bit values (R, G, B)? Or is the stored value the intensity of the image?

I also tried plotting a pixel histogram and see that the values stop at 256, which makes sense for 8-bit pixels, but I don't quite understand its significance.

Below is a histogram plotted from an image (LOW.png) where I enhanced the background.
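For reference, a pixel-value histogram of this kind can be produced with something along these lines (a rough sketch; LOW.png is just the filename mentioned above, and the bin count is arbitrary):

import matplotlib.pyplot as plt
from matplotlib.pyplot import imread

# Load the enhanced image (filename taken from above).
img = imread("LOW.png")

# Flatten every channel of every pixel into one 1-D array of values
# and histogram them. 256 bins matches the 8-bit intuition, but note
# that matplotlib may return floats in [0, 1] rather than integers 0-255.
plt.hist(img.ravel(), bins=256)
plt.xlabel("pixel value")
plt.ylabel("count")
plt.show()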

Upvotes: 0

Views: 2164

Answers (1)

Hooked

Reputation: 88128

Loading this image:

import matplotlib.pyplot as plt
from matplotlib.pyplot import imread

# Read the PNG into a numpy array and check its dimensions.
A = imread("so-logo.png")
print(A.shape)

# Display the array as an image.
plt.imshow(A)
plt.show()


and looking at the shape gives (298, 1000, 4). Thus A is an array whose first dimension is the image height, the second the width, and the third the color channels (RGBA). For example, the value A[180,45] gives the array:

[ 0.50588238  0.50588238  0.52156866  1.        ]

That pixel is about 50% red, green, and blue (so grey) and completely opaque.
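Note that matplotlib returns PNG data as floating-point values in the range 0-1 rather than as 8-bit integers. If you want the familiar 0-255 values, you can rescale the array; a minimal sketch (reusing the same so-logo.png from above; the exact numbers depend on your image):

import numpy as np
from matplotlib.pyplot import imread

A = imread("so-logo.png")

# Rescale from floats in [0, 1] to 8-bit integers in [0, 255].
A8 = (A * 255).astype(np.uint8)

# The same pixel as above, now split into its four channels.
r, g, b, alpha = A8[180, 45]
print(r, g, b, alpha)   # roughly 129 129 133 255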

Upvotes: 1
