Zebra125

Reputation: 526

How are the hidden stacked LSTM layers interconnected? (Python)

I am trying to draw a diagram of this stacked-LSTM RNN, based on this model:

import time

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense, Activation

model = Sequential()

model.add(LSTM(units=100, input_shape=(x_train.shape[1], 5), return_sequences=True))
model.add(Dropout(0.2))

model.add(LSTM(50, return_sequences=False))
model.add(Dropout(0.2))

model.add(Dense(units=1))
model.add(Activation('linear'))

start = time.time()
model.compile(loss='mse', optimizer='rmsprop')

But searching the literature, I cannot find precisely how the LSTM cells are connected to each other, and in particular how the different layers of the network are connected. I tried:

tf.keras.utils.plot_model(model, show_shapes=True, to_file="diagram_model.png")

But it didn't give me a precise enough result, and I also tried Netron with the same outcome. Can you help me find a solution? Thanks in advance.

Upvotes: 1

Views: 364

Answers (1)

M. Perier--Dulhoste

Reputation: 1039

TensorFlow provides a utility function for this: tf.keras.utils.plot_model

import tensorflow as tf

... # code to build your model

tf.keras.utils.plot_model(model, show_shapes=True, to_file="diagram_model.png")
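As for how the layers connect: with return_sequences=True, the first LSTM emits its hidden state at every timestep, so the next layer receives a full sequence; with return_sequences=False, only the final hidden state is passed on. A minimal sketch below illustrates this via the output shapes (the timestep length 10 is an arbitrary placeholder, not taken from your data):

```python
import tensorflow as tf

# First LSTM alone: return_sequences=True means one 100-dim hidden
# state per timestep, so its output is (batch, timesteps, 100).
inner = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 5)),
    tf.keras.layers.LSTM(100, return_sequences=True),
])
print(inner.output_shape)  # (None, 10, 100)

# Stacked: the second LSTM consumes that whole sequence timestep by
# timestep and, with return_sequences=False, returns only its final
# hidden state, shape (batch, 50).
stacked = tf.keras.Sequential([
    tf.keras.Input(shape=(10, 5)),
    tf.keras.layers.LSTM(100, return_sequences=True),
    tf.keras.layers.LSTM(50, return_sequences=False),
])
print(stacked.output_shape)  # (None, 50)
```

So each cell of the second LSTM layer reads the 100-dim output vector of the first layer at the corresponding timestep, plus its own recurrent state from the previous timestep; the layers are connected sequence-to-sequence, not cell-to-cell across arbitrary positions.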

Upvotes: 2
