Reputation: 793
I have an image that I have encoded and sent out using protobuf like so:
message.image = numpy.ndarray.tobytes(image)
when I receive and parse that message I use this:
image_array = numpy.frombuffer(request.image, numpy.uint8)
This gives me a one-dimensional array, and I cannot get it back into an image format. I have tried NumPy's reshape like so, but with no luck:
image = image_array.reshape( 400, 600, 3 )
The image being sent is 400x600 pixels and is a 3-channel color image. Any suggestions on what I am missing?
Upvotes: 4
Views: 6140
Reputation: 22954
You also need to store the img.shape
of the original image you encode, and while decoding you need that img.shape
value to reshape the matrix to its original form, as:
import numpy as np
# Create a dummy matrix
img = np.ones((50, 50, 3), dtype=np.uint8) * 255
# Save the shape of original matrix.
img_shape = img.shape
message_image = np.ndarray.tobytes(img)
re_img = np.frombuffer(message_image, dtype=np.uint8)
# Convert back the data to original image shape.
re_img = np.reshape(re_img, img_shape)
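Applied to the dimensions in the question, a minimal round-trip sketch might look like the following. The key assumption is that the original array really is uint8; if it is some other dtype (e.g. float32), `np.frombuffer` must be given that same dtype, or the element count will not match 400 * 600 * 3 and the reshape will fail.

```python
import numpy as np

# Hypothetical 400x600, 3-channel image like the one in the question.
img = np.zeros((400, 600, 3), dtype=np.uint8)
img[..., 0] = 255  # make one channel non-trivial

# Sender side: serialize the raw bytes, remembering the shape
# (and dtype) separately so the receiver can undo the flattening.
payload = img.tobytes()
shape = img.shape  # (400, 600, 3)

# Receiver side: the dtype passed to frombuffer must match the
# dtype of the original array, or the reshape below will fail.
decoded = np.frombuffer(payload, dtype=np.uint8).reshape(shape)

assert np.array_equal(decoded, img)
```

If the shape cannot be sent alongside the bytes, it has to be agreed on out of band, which is why `reshape(400, 600, 3)` in the question is correct in principle; a dtype mismatch is the usual reason it raises a size error.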
Upvotes: 3