Reputation: 4083
I have Keras code that declares an LSTM, but I noticed the Container class has been removed in the latest version: https://keras.io/layers/containers/
How do I declare multiple inputs for an LSTM in the latest format? I want to concatenate all the inputs into a single LSTM input.
I noticed a similar post, How to work with multiple inputs for LSTM in Keras?, but what I want to do is declare the model.
```
g = Graph()
g.add_input(name='i1', input_shape=(None, i1_size))
g.add_input(name='i2', input_shape=(None, i2_size))
g.add_node(
    LSTM(n_hidden, return_sequences=True, activation='tanh'),
    name='h1',
    inputs=['i1', 'i2']
)
```
Oh, may I just set input_shape to (i1_size+i2_size), like below?
```
model = Sequential()
model.add(LSTM(n_hidden, input_shape=(None, i1_size+i2_size), activation='tanh', return_sequences=True))
```
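To be concrete, by "concatenate all the inputs" I mean joining the two feature sets along the last axis at data-preparation time, roughly like this (just a sketch; x1 and x2 stand for NumPy arrays of shape (batch, timesteps, i1_size) and (batch, timesteps, i2_size)):
```
import numpy as np

# Join the two inputs along the feature axis so each time step
# carries i1_size + i2_size features for the single LSTM input.
x = np.concatenate([x1, x2], axis=-1)  # shape: (batch, timesteps, i1_size + i2_size)
```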
Upvotes: 3
Views: 1541
Reputation: 2653
You asked:
Oh, may I just set input_shape to (i1_size+i2_size), like below?
```
model = Sequential()
model.add(LSTM(n_hidden, input_shape=(None, i1_size+i2_size), activation='tanh', return_sequences=True))
```
Yes, Jef. Just keep in mind that the None in (None, i1_size+i2_size) is the number of RNN time steps (input_length), and there are caveats about when you can skip defining it. Please see the description of input_length at https://keras.io/layers/recurrent/ for details.
And just FYI, input_shape=(None, i1_size+i2_size) can also be written as input_dim=i1_size+i2_size (assuming you don't include input_length).
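If you would rather keep the two inputs separate and let the model do the concatenation (closer in spirit to your old Graph code), a rough sketch with the functional API could look like this (layer names are just illustrative):
```
from keras.models import Model
from keras.layers import Input, LSTM, concatenate

# Two separate sequence inputs; timesteps left as None for variable length.
i1 = Input(shape=(None, i1_size), name='i1')
i2 = Input(shape=(None, i2_size), name='i2')

# Concatenate along the feature axis, then feed the merged sequence to the LSTM.
merged = concatenate([i1, i2], axis=-1)
h1 = LSTM(n_hidden, return_sequences=True, activation='tanh', name='h1')(merged)

model = Model(inputs=[i1, i2], outputs=h1)
```
You would then pass the inputs as a list, e.g. model.predict([x1_batch, x2_batch]).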
Upvotes: 2