Reputation: 6360
%matplotlib inline
from keras.preprocessing import image
import matplotlib.pyplot as plt
import numpy as np
# A random float image in the 0-1 range displays correctly
img = np.random.rand(224, 224, 3)
plt.imshow(img)
plt.show()

# Load my image and convert it to an array
img_path = "image.jpeg"
img = image.load_img(img_path, target_size=(224, 224))
print(type(img))
x = image.img_to_array(img)
print(type(x))
print(x.shape)
plt.imshow(x)
I have some code like this which should display the image, but it shows the image with the wrong colors/channels. What am I missing here?
Upvotes: 19
Views: 35441
Reputation: 816
This question is kind of old, but there is a very convenient way to display images:
tf.keras.preprocessing.image.array_to_img(image[0]).show()
Your image has to have 3 dimensions (if it is in a batch, as is usually the case, just take the desired element first). This works fine on EagerTensors as well as NumPy arrays.
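As a minimal sketch under the question's setup (assuming a file "image.jpeg" on disk and a hypothetical batch array built from it), it could look like this:

import numpy as np
import tensorflow as tf

# Load one image and stack it into a batch of shape (1, 224, 224, 3)
img = tf.keras.preprocessing.image.load_img("image.jpeg", target_size=(224, 224))
batch = np.expand_dims(tf.keras.preprocessing.image.img_to_array(img), axis=0)

# array_to_img() returns a PIL Image; .show() opens it in an external viewer
tf.keras.preprocessing.image.array_to_img(batch[0]).show()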
Upvotes: 3
Reputation: 17201
This is an image scaling issue. matplotlib's imshow() expects float input in the 0-1 range, while img_to_array() gives you float values in the 0-255 range. Try viewing it as:
plt.imshow(x/255.)
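Alternatively, imshow() also accepts integer data in the 0-255 range, so a small sketch of the same fix (assuming x comes from img_to_array() as in the question) is to cast instead of rescale:

import numpy as np

# Floats must lie in [0, 1]; integer (uint8) data may lie in [0, 255]
plt.imshow(x.astype(np.uint8))
plt.show()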
Upvotes: 18