Dims

Reputation: 50989

Keras LSTM shape doesn't contain length of sequence

I ran the following code

from keras import layers

input_shape = (1000, 10)
x = layers.Input(shape=input_shape)
print(x.shape)

lstm1 = layers.LSTM(input_shape=input_shape, units=50, return_sequences=True)
y = lstm1(x)
print(y.shape)

and got

(?, 1000, 10)
(?, ?, 50)

Why is the second-to-last dimension of y `?`? Why isn't it 1000? Shouldn't an LSTM layer with return_sequences=True produce one output per timestep, preserving the sequence length?

Upvotes: 1

Views: 46

Answers (1)

Daniel Möller

Reputation: 86600

It does repeat. The `?` is just how the symbolic tensor reports a dimension that TensorFlow hasn't pinned down statically; how Keras and TensorFlow exchange shape information is not something you need to worry about unless you're doing very deep research into the internals.

You can always call keras.backend.int_shape(y) to see the shape Keras tracks for a tensor.
You can also call model.summary() to see the output shape of every layer.

Notice that .shape is a TensorFlow attribute, not a Keras one.

Although you see ? there, the dimension is indeed 1000.
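For example, a minimal sketch of the check described above (this assumes TensorFlow 2.x, where Keras lives under `tensorflow.keras`; on modern versions the static shape is propagated, so the 1000 is visible):

```python
# Sketch: inspect the shape Keras tracks for the LSTM output.
# Assumes TensorFlow 2.x is installed.
from tensorflow.keras import layers, backend as K

x = layers.Input(shape=(1000, 10))              # batch dimension stays None
y = layers.LSTM(50, return_sequences=True)(x)   # one output per timestep

# int_shape returns the static shape Keras knows about,
# including the preserved sequence length of 1000.
print(K.int_shape(y))
```

On TF 2.x this prints a 3-tuple whose middle entry is 1000 and last entry is 50, confirming that return_sequences=True keeps the full sequence length.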

Upvotes: 1
