Reputation: 1370
I have data that I want to store into an image. I created an image with width 100 and height 28, my matrix has the same shape. When I use Image.fromarray(matrix)
the shape changes:
from PIL import Image
img = Image.new('L', (100, 28))
tmp = Image.fromarray(matrix)
print(matrix.shape) # (100, 28)
print(tmp.size) # (28, 100)
img.paste(tmp, (0, 0, 100, 28)) # ValueError: images do not match
When I use img.paste(tmp, (0, 0))
the object is pasted into the image, but the part starting with the x value 28 is missing.
Why does the dimension change?
Upvotes: 1
Views: 7485
Reputation: 3
NumPy and PIL use different conventions: NumPy indexes arrays as (row, column), while PIL reports image size as (width, height). So a (100, 28) NumPy array has 100 rows and 28 columns, which PIL interprets as an image with width 28 and height 100.
If you want a 28x100 image, then you should swap the dimensions for your image instantiation.
img = Image.new('L', (28, 100))
If you want a 100x28 image, then you should transpose the numpy array.
tmp = Image.fromarray(matrix.transpose())
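A minimal sketch of the size swap, assuming a grayscale uint8 array of the same shape as in the question:

```python
import numpy as np
from PIL import Image

# A (100, 28) array: 100 rows, 28 columns
matrix = np.zeros((100, 28), dtype=np.uint8)

# Without transposing, PIL reports size (width, height) = (28, 100)
print(Image.fromarray(matrix).size)              # (28, 100)

# Transposing swaps rows and columns, giving a 100x28 image
print(Image.fromarray(matrix.transpose()).size)  # (100, 28)
```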
More generally, if you're working with RGB, pass explicit axes to transpose() so that only the first two axes are swapped and the channel axis stays last.
>>> arr = np.zeros((100, 28, 3))
>>> arr.shape
(100, 28, 3)
>>> arr.transpose(1, 0, 2).shape
(28, 100, 3)
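Putting the two pieces together, a sketch of converting such an RGB array to a PIL image (using a zero-filled uint8 array as placeholder data):

```python
import numpy as np
from PIL import Image

# (rows, cols, channels) array: 100 rows, 28 columns, RGB
arr = np.zeros((100, 28, 3), dtype=np.uint8)

# Swap only the first two axes; the channel axis must stay last
swapped = arr.transpose(1, 0, 2)
print(swapped.shape)                  # (28, 100, 3)

# PIL reports (width, height), so the image is 100x28
print(Image.fromarray(swapped).size)  # (100, 28)
```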
Upvotes: 0
Reputation: 943
PIL and NumPy have different indexing systems: matrix[a, b]
gives you the point at x position b and y position a, while img.getpixel((a, b))
gives you the point at x position a and y position b. As a result, the dimensions are swapped when you convert between NumPy arrays and PIL images. To fix this, you can take the transpose of the matrix (matrix.transpose()).
Here's what's happening:
import numpy as np
from PIL import Image
img = Image.new('L', (100, 28))
img.putpixel((5, 3), 17)
matrix = np.array(img)
print(matrix[5, 3])  # This returns 0
print(matrix[3, 5])  # This returns 17
matrix = matrix.transpose()
print(matrix[5, 3])  # This returns 17
print(matrix[3, 5])  # This returns 0
Upvotes: 3