I have an LSTM feed-forward neural network, written below. For my use case, I also need a backward (recurrent) connection from layer3 to layer1, which creates a loop in the model's architecture. How can I modify the code below to introduce such a loop?
import tensorflow as tf
from tensorflow.keras.layers import Input, Dense, LSTM
from tensorflow.keras.models import Model
input_shape = ... # Shape of your input data
hidden_units1 = ... # Number of units in the first hidden layer
hidden_units2 = ... # Number of units in the second hidden layer
hidden_units3 = ... # Number of units in the third hidden layer
output_units = ... # Number of output units
# Define the input layer
input_layer = Input(shape=input_shape)  # input_shape must be a (timesteps, features) tuple, since LSTM layers expect 3D input
# Define the stacked LSTM hidden layers
layer1 = LSTM(units=hidden_units1, activation='relu', return_sequences=True)(input_layer)
layer2 = LSTM(units=hidden_units2, activation='relu', return_sequences=True)(layer1)
layer3 = LSTM(units=hidden_units3, activation='relu')(layer2)
output_layer = Dense(units=output_units, activation='softmax')(layer3)
model = Model(inputs=input_layer, outputs=output_layer)
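For context (not from the original post): the functional API cannot express a cycle directly, so one common workaround is to drop down to `LSTMCell` and drive the timestep loop manually, concatenating layer3's output from the previous step onto layer1's input at the current step. A minimal sketch of that idea, with all sizes hypothetical:

```python
import tensorflow as tf
from tensorflow.keras.layers import LSTMCell, Dense

# Hypothetical sizes, chosen only for this sketch
timesteps, features = 5, 8
hidden_units1 = hidden_units2 = hidden_units3 = 16
output_units = 4

class FeedbackModel(tf.keras.Model):
    """Three stacked LSTM cells; layer3's output feeds back into layer1."""
    def __init__(self):
        super().__init__()
        self.cell1 = LSTMCell(hidden_units1)
        self.cell2 = LSTMCell(hidden_units2)
        self.cell3 = LSTMCell(hidden_units3)
        self.classifier = Dense(output_units, activation='softmax')

    def call(self, x):
        batch = tf.shape(x)[0]
        # Zero-initialize each cell's [h, c] state
        zeros = lambda n: [tf.zeros([batch, n]), tf.zeros([batch, n])]
        s1, s2, s3 = zeros(hidden_units1), zeros(hidden_units2), zeros(hidden_units3)
        # layer3's output from the previous timestep (zeros at t=0)
        feedback = tf.zeros([batch, hidden_units3])
        for t in range(timesteps):
            # The backward connection: current input concatenated with
            # the previous step's layer3 output, fed into layer1.
            step_in = tf.concat([x[:, t, :], feedback], axis=-1)
            h1, s1 = self.cell1(step_in, s1)
            h2, s2 = self.cell2(h1, s2)
            feedback, s3 = self.cell3(h2, s3)
        return self.classifier(feedback)

model = FeedbackModel()
probs = model(tf.zeros([2, timesteps, features]))  # shape (2, output_units)
```

The loop is "unrolled" over a fixed number of timesteps here; for variable-length sequences the same pattern can be wrapped in `tf.while_loop` or a custom RNN layer.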