dangerChihuahua007

Reputation: 20885

Why doesn't the shape of my numpy array change?

I have made a numpy array from image data, and I want to convert it into a one-dimensional array.

import numpy as np
import matplotlib.image as img

if __name__ == '__main__':

  my_image = img.imread("zebra.jpg")[:,:,0]
  width, height = my_image.shape
  my_image = np.array(my_image)
  img_buffer = my_image.copy()
  img_buffer.reshape(width * height)
  print(img_buffer.shape)

The 128x128 image is here.


However, this program prints out (128, 128). I want img_buffer to be a one-dimensional array. How do I reshape this array? Why doesn't numpy actually reshape it into a one-dimensional array?

Upvotes: 4

Views: 10691

Answers (2)

John Vinyard

Reputation: 13485

reshape doesn't work in place. Your code isn't working because you aren't assigning the value returned by reshape back to img_buffer.
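For example, a minimal version of that fix (assuming you keep your original reshape call) is simply to assign the result back:

>>> img_buffer = img_buffer.reshape(width * height)
>>> img_buffer.shape
(16384,)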

If you want to flatten the array to one dimension, ravel or flatten might be easier options.

>>> img_buffer = img_buffer.ravel()
>>> img_buffer.shape
(16384,)
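flatten works the same way; the difference is that flatten always returns a copy, while ravel returns a view when it can:

>>> img_buffer = img_buffer.flatten()
>>> img_buffer.shape
(16384,)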

Otherwise, you'd want to do:

>>> img_buffer = img_buffer.reshape(np.prod(img_buffer.shape))
>>> img_buffer.shape
(16384,)

Or, more succinctly:

>>> img_buffer = img_buffer.reshape(-1)
>>> img_buffer.shape
(16384,)

Upvotes: 5

wim

Reputation: 362517

.reshape returns a new array, rather than reshaping in place.

By the way, you appear to be trying to get a bytestring of the image - you probably want to use my_image.tostring() instead.
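For example (a quick sketch; in newer NumPy versions the method is called tobytes()):

>>> raw = my_image.tostring()
>>> len(raw)
16384

That length assumes the image data is 8-bit (uint8), so each of the 128*128 pixels contributes one byte.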

Upvotes: 8
