Jobs

Reputation: 3377

Python - Image from numpy array losing true pixels

I have a JPG picture of a face. I need to access the picture pixel by pixel (i.e., know the value at each pixel) and use some sort of DFS to change the background color.

from PIL import Image
import numpy as np

image = Image.open("pic.jpg")
image = np.array(image)

First of all, why is the shape of the array (473, 354, 3)? It doesn't make sense to me.

When I do

import matplotlib.pyplot as plt

plt.imshow(image.reshape(473, -1))
plt.show()

I get a picture that looks like the following, which consists only of red, blue, and yellow colors (and mixtures of the three?):

This makes me think the values in the array are not something I can reliably use for my edge-detection decisions.

Why is this happening, and what should I do?

[screenshot: the reshaped array rendered as red, blue, and yellow bands]

I want the pixel values to reflect the true colors of the original image, not the false colors shown above.

The background in the actual picture is mostly white, and I want it and all the other pixel values to stay that way so I can implement my algorithm.

Upvotes: 0

Views: 696

Answers (1)

Charles McGuinness

Reputation: 21

The 3 is because each color channel gets its own entry in the array: a pixel is a triple of intensities rather than a single value. Since you loaded the image with PIL, the channels are in RGB (red, green, blue) order.
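A minimal sketch of what that shape means, assuming the image was loaded with PIL as in the question (and that "pic.jpg" exists):

from PIL import Image
import numpy as np

image = np.array(Image.open("pic.jpg"))
print(image.shape)     # (473, 354, 3): height, width, color channels
r, g, b = image[0, 0]  # the RGB triple at the top-left pixel
print(r, g, b)

This also explains the strange colors you saw: image.reshape(473, -1) flattens the three channels into a single (473, 1062) plane, so plt.imshow treats the values as scalar data and maps them through its default colormap. Calling plt.imshow(image) on the unmodified (473, 354, 3) array displays the true colors.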

For edge detection, you might do best to collapse the image down to a single grayscale channel. OpenCV's cv2.cvtColor will do the trick; since a PIL-loaded array is RGB rather than OpenCV's usual BGR, use cv2.cvtColor(image, cv2.COLOR_RGB2GRAY).
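A minimal sketch of that conversion, assuming OpenCV (cv2) is installed and the array came from PIL as above:

import cv2
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt

image = np.array(Image.open("pic.jpg"))  # RGB array, shape (473, 354, 3)

# Collapse three channels to one intensity channel for edge detection.
# PIL arrays are RGB, so RGB2GRAY is the correct flag here.
gray = cv2.cvtColor(image, cv2.COLOR_RGB2GRAY)
print(gray.shape)  # (473, 354)

# Tell Matplotlib the data is grayscale, or it will apply its default colormap.
plt.imshow(gray, cmap="gray")
plt.show()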

Upvotes: 1
