Bartek Sadlej

Reputation: 15

LSTM input longer than output

I am not sure I understand exactly how the Keras version of LSTM works. Say I have a vector of len=20 as input and I specify keras.layers.LSTM(units=10). In this example, does the network stop after processing 50% of the input, or does it process the rest starting again from the first cell?

Upvotes: 0

Views: 231

Answers (1)

Daniel Möller

Reputation: 86610

Units are never related to the input size.
Units are related only to the output size (units = output features or channels).

An LSTM layer always processes the entire input sequence, and it optionally returns either the "same length (all steps)" or "no length (only the last step)".


In terms of shapes

You must have an input tensor with shape (batch, len=20, input_features).

And it will output:

  • For return_sequences=False: (batch, output_features=10) - no length
  • For return_sequences=True: (batch, len=20, output_features=10) - same length

The number of output features is always equal to units.
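A quick sketch of these shapes (the batch size of 4 and the 3 input features are arbitrary choices for illustration):

```python
import numpy as np
import tensorflow as tf

# A batch of 4 sequences, each of length 20, with 3 features per step.
x = np.random.random((4, 20, 3)).astype("float32")  # (batch, len, input_features)

lstm_last = tf.keras.layers.LSTM(units=10)                         # return_sequences=False (default)
lstm_all = tf.keras.layers.LSTM(units=10, return_sequences=True)

out_last = lstm_last(x)  # shape (4, 10): only the last step, no length dimension
out_all = lstm_all(x)    # shape (4, 20, 10): one output per input step, same length
```

Either way, all 20 steps are processed; return_sequences only controls whether the intermediate steps are returned.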


See a full comprehension of the LSTM layers here: Understanding Keras LSTMs

Upvotes: 1
