Reputation: 4307
This is my code:
x_and_h = Concatenate()([x_t_emb, h_t])
And this is the error:
ValueError: `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 16), (1, 64)]
I've been trying various reshape operations to make the shapes compatible, but both these attempts failed:
h_t = tf.reshape(h_t, shape=tf.TensorShape([None, h_t.get_shape()[-1].value]))
x_t = tf.reshape(x_t, shape=tf.TensorShape([1, x_t.get_shape()[-1].value]))
Can somebody please explain to me what's going on, and how to solve the issue?
Upvotes: 1
Views: 648
Reputation: 183
Can you provide more details on the layers that you defined?
In any case, the `Concatenate` layer requires that all dimensions match between the two tensors except along the concatenation axis. Put simply, here neither dimension matches: `None` doesn't match 1 on the batch axis, and 16 doesn't match 64 on the feature axis.
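The rule above can be checked in plain Python. This is a hypothetical helper (not part of Keras) that mirrors the shape check: drop the concat axis, then require every remaining pair of dimensions to be equal, where `None` only matches another `None`:

```python
def concat_compatible(shape_a, shape_b, axis=-1):
    """Return True if two static shapes can be concatenated on `axis`.

    Shapes are tuples of ints or None (unknown/dynamic dimension),
    as Keras reports them. Hypothetical helper for illustration only.
    """
    if len(shape_a) != len(shape_b):
        return False
    axis = axis % len(shape_a)  # normalize negative axis
    for i, (a, b) in enumerate(zip(shape_a, shape_b)):
        if i == axis:
            continue  # the concat axis is allowed to differ
        if a != b:    # None == None passes; None vs 1 fails, as in the error
            return False
    return True

# The shapes from the question fail on both axes:
print(concat_compatible((None, 16), (1, 64)))    # False
# Matching batch dimensions make concatenation legal:
print(concat_compatible((None, 16), (None, 64))) # True
```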
In addition, reshaping the way you are doing it is wrong. Initially, h_t has shape (1, 64), and with:
h_t = tf.reshape(h_t, shape=tf.TensorShape([None, h_t.get_shape()[-1].value]))
you are trying to reshape it into a (None, 64) tensor, which is conceptually wrong: `None` means the dimension can be any integer and usually represents your batch size, so it cannot be used as a target dimension for `tf.reshape`.
The same problem occurs with:
x_t = tf.reshape(x_t, shape=tf.TensorShape([1, x_t.get_shape()[-1].value]))
where the initial shape is (None, 16) and you are trying to force it into the fixed shape (1, 16), discarding the dynamic batch dimension.
I think you should make your h_t layer return a tensor of shape (None, 1, 64) or (None, 64), so that in the first case you can flatten and then concatenate, and in the second case just concatenate directly.
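A minimal NumPy sketch of that fix (a standalone illustration, not the asker's actual model: the batch size and zero tensors are stand-ins). The idea is to give h_t the same batch dimension as x_t_emb before concatenating; in TF/Keras the analogous ops are `tf.broadcast_to` or `tf.tile`:

```python
import numpy as np

batch = 4                         # stands in for the dynamic None dimension
x_t_emb = np.zeros((batch, 16))   # plays the role of the (None, 16) tensor
h_t = np.zeros((1, 64))           # the (1, 64) tensor from the question

# Repeat h_t across the batch so both tensors are (batch, features)
h_t_batched = np.broadcast_to(h_t, (batch, 64))

# Now all dimensions except the concat axis (-1) match
x_and_h = np.concatenate([x_t_emb, h_t_batched], axis=-1)
print(x_and_h.shape)  # (4, 80)
```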
Upvotes: 1