neel

Reputation: 9061

ValueError: could not broadcast input array from shape (224,224,3) into shape (224,224)

I have a list, say temp_list, with the following properties:

len(temp_list) = 9260  
temp_list[0].shape = (224,224,3)  

Now, when I am converting it into a numpy array,

x = np.array(temp_list)  

I am getting the error:

ValueError: could not broadcast input array from shape (224,224,3) into shape (224,224)  

Can someone help me here?

Upvotes: 118

Views: 589083

Answers (10)

biendltb

Reputation: 1249

NumPy will try to auto-unify the list into a single array when the element shapes are close enough (differing in at most one dimension), which is what triggers this broadcast error.

If you don't want a unified array (i.e. you want each element to keep its own shape), you can use this workaround:

x = np.empty(len(temp_list), dtype=object)  # 1-D array of Python objects
for i, arr in enumerate(temp_list):
    x[i] = arr
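
You can then check that each element keeps its own shape (a quick sketch, assuming temp_list is the list from the question):

>>> x.shape
(9260,)
>>> x.dtype
dtype('O')
>>> x[0].shape
(224, 224, 3)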

Hope it helps!

Upvotes: 1

Jagesh Maharjan

Reputation: 913

Indeed, @Evert's answer is perfectly correct. In addition, I'd like to add one more scenario that can produce this error.

>>> np.array([np.zeros((20,200)),np.zeros((20,200)),np.zeros((20,200))])

This is perfectly fine. However, the following leads to a ValueError:

>>> np.array([np.zeros((20,200)),np.zeros((20,200)),np.zeros((20,201))])

ValueError: could not broadcast input array from shape (20,200) into shape (20)

The numpy arrays within the list must all have the same shape.
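
A quick way to catch this before converting is to collect the distinct shapes (a small sketch, with arrs standing in for your own list):

>>> arrs = [np.zeros((20,200)), np.zeros((20,200)), np.zeros((20,201))]
>>> set(a.shape for a in arrs)
{(20, 200), (20, 201)}

If that set contains more than one shape, np.array(arrs) cannot stack the list into a single regular array.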

Upvotes: 14

codingPhobia

Reputation: 103

In my case the problem was in my data set: I needed to preprocess the data before further processing, because the images were in mixed formats (some RGB and some grayscale), so their dimensions did not match. I simply followed Mudasir Habib's answer.

from PIL import Image
img = Image.open('my_image.jpg').convert('RGB')

Upvotes: 3

Zrn-dev

Reputation: 169

SOLVED - I got the same error: X_test = np.array(X_test) raised ValueError: could not broadcast input array from shape (50,50,3) into shape (50,50)

I printed every image's shape and got this:

~

1708 : (50, 50, 3)

1709 : (50, 50)

1710 : (50, 50)

1711 : (50, 50, 3)

1712 : (50, 50, 3)

1713 : (50, 50, 3)

~

which means the data contained a mix of 2D (grayscale) and 3D (color) images after reading 2 different image folders and shuffling them.

The first kind is a grayscale image and the second is a color image.

I added cv2.IMREAD_GRAYSCALE when loading the images and the problem was solved.

Summary: the image data I wanted to convert into an np array contained images with different dimensions (a short sketch of both steps follows this list)

-> checked the image data

-> found that there were both 2D and 3D images

-> converted the 3D images to grayscale (2D)

-> problem solved
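
A minimal sketch of both steps; the names image_paths and X_test are placeholders, not from the original post:

import cv2
import numpy as np

# Diagnostic: print every image's shape to spot the mismatched ones
# (X_test is the already-loaded list of images)
for i, img in enumerate(X_test):
    print(i, ":", img.shape)

# Fix: re-read every file as grayscale so all images are 2D, e.g. (50, 50)
# (image_paths is a placeholder for your own list of file paths)
X_test = np.array([cv2.imread(p, cv2.IMREAD_GRAYSCALE) for p in image_paths])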

Upvotes: 1

Hassan Shahzad

Reputation: 584

I totally agree with @mudassir's answer. If you have augmented your dataset, it's highly likely that you will get this error: most augmentation pipelines automatically apply a grayscale effect, which produces two-dimensional images, whereas the original RGB pictures are three-dimensional. I was using a Roboflow dataset that was already augmented and had a similar issue. I removed the grayscaling step and it still gave the error; however, once I removed grayscale, hue and saturation, it worked like a charm. I would suggest you try that too.

Upvotes: 0

Wang Wei

Reputation: 69

This method does not need to modify dtype or ravel your numpy array.

The core idea is: 1. initialize one element with an extra row, 2. convert the list (whose elements now differ in their first dimension) to an array, and 3. delete the extra row from the resulting array. For example:

>>> a = [np.zeros((10,224)), np.zeros((10,))]
>>> np.array(a)
# this will raise error,
ValueError: could not broadcast input array from shape (10,224) into shape (10)

# but below method works
>>> a = [np.zeros((11,224)), np.zeros((10,))]
>>> b = np.array(a)
>>> b[0] = np.delete(b[0],0,0)
>>> print(b.shape,b[0].shape,b[1].shape)
# print result:(2,) (10,224) (10,)

Indeed, it's not strictly necessary to add a whole extra row; as long as you avoid the shape clash described in @aravk33's and @user707650's answers and delete the extra items later, it will be fine.

Upvotes: 3

Mudasir Habib

Reputation: 848

I was facing the same problem because some of the images in my data set were grayscale, so I solved it by doing this:

    from PIL import Image
    img = Image.open('my_image.jpg').convert('RGB')
    # a line from my program
    # (Image.ANTIALIAS was removed in Pillow 10; use Image.LANCZOS on newer versions)
    positive_images_array = np.array([np.array(Image.open(img).convert('RGB').resize((150, 150), Image.ANTIALIAS)) for img in images_in_yes_directory])

Upvotes: 8

Naman Bansal

Reputation: 331

@aravk33's answer is absolutely correct.

I was going through the same problem. I had a data set of 2450 images. I just could not figure out why I was facing this issue.

Check the dimensions of all the images in your training data.

Add the following snippet while appending your image into your list:

if image.shape == (1, 512, 512):
    trainx.append(image)
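
For context, this is roughly how that check sits in the loading loop (images here is a placeholder name; trainx and the shape check come from the snippet above):

trainx = []
for image in images:  # images: placeholder for your iterable of loaded arrays
    if image.shape == (1, 512, 512):
        trainx.append(image)
trainx = np.array(trainx)  # every element now has the same shape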

Upvotes: 3

user707650

Reputation:

At least one item in your list is either not three dimensional, or its second or third dimension does not match the other elements. If only the first dimension does not match, the arrays are still matched, but as individual objects; no attempt is made to reconcile them into a new (four-dimensional) array. Some examples are given below.

That is, the offending element's shape != (?, 224, 3),
or its ndim != 3 (with the ? being a non-negative integer).
That is what is giving you the error.

You'll need to fix that to be able to turn your list into a four- (or three-) dimensional array. Without context, it is impossible to say whether you want to drop a dimension from the 3D items or add one to the 2D items (in the first case), or change the second or third dimension (in the second case).


Here's an example of the error:

>>> a = [np.zeros((224,224,3)), np.zeros((224,224,3)), np.zeros((224,224))]
>>> np.array(a)
ValueError: could not broadcast input array from shape (224,224,3) into shape (224,224)

or, different type of input, but the same error:

>>> a = [np.zeros((224,224,3)), np.zeros((224,224,3)), np.zeros((224,224,13))]
>>> np.array(a)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: could not broadcast input array from shape (224,224,3) into shape (224,224)

Alternatively, similar but with a different error message:

>>> a = [np.zeros((224,224,3)), np.zeros((224,224,3)), np.zeros((224,100,3))]
>>> np.array(a)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ValueError: could not broadcast input array from shape (224,224,3) into shape (224)

But the following will work, albeit with different results than (presumably) intended:

>>> a = [np.zeros((224,224,3)), np.zeros((224,224,3)), np.zeros((10,224,3))]
>>> np.array(a)
# long output omitted
>>> newa = np.array(a)
>>> newa.shape
(3,)  # oops
>>> newa.dtype
dtype('O')
>>> newa[0].shape
(224, 224, 3)
>>> newa[1].shape
(224, 224, 3)
>>> newa[2].shape
(10, 224, 3)
>>> 
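
Note that on newer NumPy versions (1.24 and later) this last case no longer silently produces an object array; building a ragged array now raises an error unless you request an object array explicitly. A minimal sketch of that explicit route:

>>> newa = np.array(a, dtype=object)
>>> newa.shape
(3,)
>>> newa.dtype
dtype('O')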

Upvotes: 119

Yinjie Gao

Reputation: 121

You can convert a numpy.ndarray to object dtype using astype(object).

This will work:

>>> a = [np.zeros((224,224,3)).astype(object), np.zeros((224,224,3)).astype(object), np.zeros((224,224,13)).astype(object)]

Upvotes: 12
