DanielSon

Reputation: 1545

Shape mismatch with tf.placeholder

I'm using 128 x 128 x 128 ndarrays as input to a CNN using:

# input arrays
x = tf.placeholder(tf.float32, [None, 128, 128, 128, 1])

Each ndarray had no colour channel data, so I used:

data = np.reshape(data, (128, 128, 128, 1))

to get it to fit into the placeholder. But now I'm getting this error:

Traceback (most recent call last):
  File "tfvgg.py", line 287, in <module>
    for i in range(10000 + 1): training_step(i, i % 100 == 0, i % 20 == 0)
  File "tfvgg.py", line 277, in training_step
    a, c = sess.run([accuracy, cross_entropy], {x: batch_X, y: batch_Y})
  File "/home/entelechy/tfenv/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 717, in run
run_metadata_ptr)
  File "/home/entelechy/tfenv/lib/python3.5/site-packages/tensorflow/python/client/session.py", line 894, in _run
% (np_val.shape, subfeed_t.name, str(subfeed_t.get_shape())))
ValueError: Cannot feed value of shape (128, 128, 128, 1) for Tensor 'Placeholder:0', which has shape '(?, 128, 128, 128, 1)'

I'm confused about the way placeholders work, because I thought the first parameter was for the batch size. By using None there, I thought the placeholder would accept any number of (128, 128, 128, 1) inputs. Because this is a 3D net, if I change the placeholder shape to (128, 128, 128, 1), an error is thrown at the first conv3d layer because a dimension is missing.

What am I missing about placeholder parameter passing?

Edit: (train_data is a list of lists, with each being [ndarray, label])

This is the training step of the net:

def training_step(i, update_test_data, update_train_data):

    for a in range(len(train_data)):

        batch = train_data[a]
        batch_X = batch[0]
        batch_Y = batch[1]

        # learning rate decay
        max_learning_rate = 0.003
        min_learning_rate = 0.0001
        decay_speed = 2000.0
        learning_rate = min_learning_rate + (max_learning_rate - min_learning_rate) * math.exp(-i / decay_speed)

        if update_train_data:
            a, c = sess.run([accuracy, cross_entropy], {x: batch_X, y: batch_Y})
            print(str(i) + ": accuracy:" + str(a) + " loss: " + str(c) + " (lr:" + str(learning_rate) + ")")


        if update_test_data:
            a, c = sess.run([accuracy, cross_entropy], {x: test_data[0], y: test_data[1]})
        print(str(i) + ": ********* epoch " + " ********* test accuracy:" + str(a) + " test loss: " + str(c))

        sess.run(train_step, {x: batch_X, y: batch_Y, lr: learning_rate})

for i in range(10000 + 1): training_step(i, i % 100 == 0, i % 20 == 0)

Upvotes: 1

Views: 628

Answers (2)

BlueSun

Reputation: 3570

In your last question you fed the network a list containing one image: [image]. That is why the first (batch) dimension was not needed and reshaping to (128, 128, 128, 1) was enough. Feeding [image] or [image1, image2, image3] worked in that example. Now, however, you are feeding the image without the list: batch[0]. The first dimension is gone, so it does not work.

[np.reshape(image, (128, 128, 128, 1))] has overall shape (1, 128, 128, 128, 1) and works.

np.reshape(image, (1, 128, 128, 128, 1)) has overall shape (1, 128, 128, 128, 1) and works too.

np.reshape(image, (128, 128, 128, 1)) without the list has overall shape (128, 128, 128, 1) and does not work.

You can either put the image back into a list or reshape it directly to (1, 128, 128, 128, 1). In both cases the overall shape will be correct. However if you also plan to input multiple images it is simpler to use the list, and fill it with (128, 128, 128, 1) shaped images.

As the code is now, you can also use batch_X = [batch[0]] for one image, and batch_X = batch[0:4] for multiple images.
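A quick NumPy sketch (using a zero-filled stand-in for one 128 x 128 x 128 volume) shows the three shapes described above:

```python
import numpy as np

# Stand-in for a single volume with no channel data
image = np.zeros((128, 128, 128))

# Wrapping the (128, 128, 128, 1) array in a list adds the batch dimension
a = np.array([np.reshape(image, (128, 128, 128, 1))])
print(a.shape)  # (1, 128, 128, 128, 1) -- matches the placeholder

# Reshaping directly to rank 5 gives the same overall shape
b = np.reshape(image, (1, 128, 128, 128, 1))
print(b.shape)  # (1, 128, 128, 128, 1) -- also matches

# Rank 4 alone cannot be fed to a rank-5 placeholder
c = np.reshape(image, (128, 128, 128, 1))
print(c.shape)  # (128, 128, 128, 1) -- raises the ValueError above
```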

Upvotes: 1

kafman

Reputation: 2860

Your placeholder has rank 5 so you need to feed a 5-dimensional np array, but you reshaped to a 4-dimensional np array. So, use data = np.reshape(data, (1, 128, 128, 128, 1)) instead of data = np.reshape(data, (128, 128, 128, 1)) as pointed out in the comments.

Essentially, None in the shape of a placeholder means that the size of this dimension is variable, but the dimension should still be there.
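In NumPy terms (a sketch, independent of TensorFlow), the variable batch dimension is just a leading axis you add with np.expand_dims for one sample or np.stack for several:

```python
import numpy as np

# One channel-expanded volume, rank 4
image = np.zeros((128, 128, 128, 1))

# Add the batch axis for a single sample
single = np.expand_dims(image, axis=0)
print(single.shape)  # (1, 128, 128, 128, 1)

# Stack several samples along a new leading axis
batch = np.stack([image, image, image])
print(batch.shape)  # (3, 128, 128, 128, 1)
```

Both results are rank 5, so either fits a placeholder declared as [None, 128, 128, 128, 1]; only the size of the first axis varies.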

Upvotes: 0
