user3425082

Reputation: 139

Dropout in Keras doesn't have any effect when changing learning_phase

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.layers import Input, Dense, Dropout
from keras.models import Model

inp = Input([10], name='Input')
X = Dense(10, activation='relu', kernel_initializer='glorot_uniform')(inp)
X = Dropout(0.5, seed=0)(X)
X = Dense(1, activation='relu', kernel_initializer='glorot_uniform')(X)
X = Dropout(0.5, seed=0)(X)
m = Model(inputs=inp, outputs=X)
u = np.random.rand(1, 10)

sess = K.get_session()
sess.run(tf.global_variables_initializer())
K.set_learning_phase(0)
print(sess.run(X, {inp: u}))
print(sess.run(X, {inp: u}))
K.set_learning_phase(1)
print(sess.run(X, {inp: u}))
print(sess.run(X, {inp: u}))
print(m.predict(u))

Here is my code. When I run the model, I get the same result on every run. However, shouldn't the output vary slightly between runs because of the dropout layers?

Upvotes: 0

Views: 283

Answers (1)

layog

Reputation: 4811

All the dropout layers have been seeded, which makes the model always drop the same neurons. Since you have a fixed input, fixed weights for each layer (you are not optimizing the model), and the same dropout mask every time, you will always get the same output.
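For contrast, here is a minimal sketch (assuming TF 1.x with standalone Keras, as in the question) with the seeds removed. The learning-phase placeholder is fed explicitly in feed_dict to make sure the dropout layers are actually active:

import numpy as np
import tensorflow as tf
from keras import backend as K
from keras.layers import Input, Dense, Dropout
from keras.models import Model

inp = Input([10], name='Input')
X = Dense(10, activation='relu', kernel_initializer='glorot_uniform')(inp)
X = Dropout(0.5)(X)  # no seed: a fresh random mask is sampled per run
X = Dense(1, activation='relu', kernel_initializer='glorot_uniform')(X)
X = Dropout(0.5)(X)
m = Model(inputs=inp, outputs=X)

u = np.random.rand(1, 10)
sess = K.get_session()
sess.run(tf.global_variables_initializer())

# Feed the learning-phase placeholder directly so dropout is applied.
print(sess.run(X, {inp: u, K.learning_phase(): 1}))  # dropout on
print(sess.run(X, {inp: u, K.learning_phase(): 1}))  # should differ from above
print(sess.run(X, {inp: u, K.learning_phase(): 0}))  # dropout off: deterministic

With the seeds gone, the two training-phase runs should print different values, while the inference-phase run stays deterministic.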

Upvotes: 2
