I've found it bizarre that NumPy arrays and PIL images report their dimensions in opposite orders: (H, W) in NumPy versus (W, H) in PIL. My versions are:

Name: numpy Version: 1.13.3
Name: Pillow Version: 4.1.1
import numpy as np
from PIL import Image
import matplotlib.pyplot as plt
%matplotlib inline

IMG = '/path/to/test-image.jpg'

with Image.open(IMG) as img:
    print(img.size)                        # PIL reports (W, H)

    img_np_f = np.asarray(img, order='F')  # Fortran (column-major) layout
    print(img_np_f.shape)                  # NumPy reports (H, W)

    img_np_c = np.asarray(img, order='C')  # C (row-major) layout
    print(img_np_c.shape)

    plt.subplot(131)
    plt.imshow(img)
    plt.subplot(132)
    plt.imshow(img_np_f)
    plt.subplot(133)
    plt.imshow(img_np_c)
    plt.show()
The output is:
(320, 240)
(240, 320)
(240, 320)
However, it seems matplotlib handles it correctly anyway.
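As an aside, the order argument passed to np.asarray above controls memory layout (row-major vs. column-major), not the shape, which is why both calls print the same tuple. A minimal sketch with a plain zeros array (no image file needed):

```python
import numpy as np

a = np.zeros((240, 320))

c = np.asarray(a, order='C')  # row-major (C) layout
f = np.asarray(a, order='F')  # column-major (Fortran) layout

print(c.shape, f.shape)            # (240, 320) (240, 320) -- shape is unchanged
print(c.flags['C_CONTIGUOUS'])     # True
print(f.flags['F_CONTIGUOUS'])     # True
```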
Because NumPy is not an imaging library. numpy.ndarray.shape gives the shape in the order (H, W, D) to stay coherent with the ndarray axis terminology: axis=0 runs over rows (height), axis=1 over columns (width), and axis=2 over channels (depth).
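To illustrate the axis correspondence, here is a sketch using a synthetic 240x320 RGB array in place of a real image; the tuple PIL would report for the same picture is the NumPy (H, W) pair reversed:

```python
import numpy as np

# A synthetic 240x320 RGB "image" (hypothetical data, not loaded from a file)
h, w, d = 240, 320, 3
img_np = np.zeros((h, w, d), dtype=np.uint8)

print(img_np.shape)  # (240, 320, 3): axis=0 -> H, axis=1 -> W, axis=2 -> D

# PIL's img.size for the same picture would be (W, H):
pil_size = (img_np.shape[1], img_np.shape[0])
print(pil_size)      # (320, 240)
```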