I have:
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense, TimeDistributed
from tensorflow.keras import optimizers

model = Sequential()
model.add(LSTM(32, input_shape=(SEQ_LENGTH, VECTOR_SIZE),
               return_sequences=True))
model.add(TimeDistributed(Dense(VECTOR_SIZE, activation='relu')))

adam_optimizer = optimizers.Adam(learning_rate=0.001, beta_1=0.9,
                                 beta_2=0.999, amsgrad=False)
model.compile(loss='mean_squared_error', optimizer=adam_optimizer)
The input and output of my model are both of shape (100, 129).
from tensorflow.keras.layers import BatchNormalization
from tensorflow.keras import regularizers

model.add(BatchNormalization(center=True, scale=True,
                             beta_regularizer=regularizers.l2(0.01),
                             gamma_regularizer=regularizers.l2(0.01),
                             beta_constraint='max_norm',
                             gamma_constraint='max_norm',
                             input_shape=(x, y)))
BatchNormalization is just a layer that you add to your model like any other; place it after the layer whose activations you want to normalize. Most of the arguments above are optional and shown only for completeness, and `input_shape` is needed only when it is the first layer in the model.
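For instance, a minimal sketch of how it would slot into your model, with BatchNormalization (default arguments) inserted after the LSTM. The SEQ_LENGTH/VECTOR_SIZE values here are shrunk so the example builds quickly; they are illustrative, not taken from your post:

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import (Input, LSTM, Dense,
                                     TimeDistributed, BatchNormalization)

SEQ_LENGTH, VECTOR_SIZE = 10, 8  # illustrative sizes, not the original 100/129

model = Sequential([
    Input(shape=(SEQ_LENGTH, VECTOR_SIZE)),
    LSTM(32, return_sequences=True),
    BatchNormalization(),  # normalizes the LSTM activations per batch
    TimeDistributed(Dense(VECTOR_SIZE, activation='relu')),
])
model.compile(loss='mean_squared_error', optimizer='adam')

# Dummy forward pass: output shape matches the input
# (batch, SEQ_LENGTH, VECTOR_SIZE), just as in your model.
out = model.predict(np.zeros((2, SEQ_LENGTH, VECTOR_SIZE)), verbose=0)
print(out.shape)  # (2, 10, 8)
```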