Piggy Wenzhou

Reputation: 305

Does dropout apply to the regularization term in Keras?

I have a custom regularization term, my_reg.

from keras.models import Sequential
from keras.layers import Dense, Dropout

model = Sequential()
model.add(Dense(128, activation='relu'))
model.add(Dense(64, activation='relu'))
# The last layer is regularized with a custom regularizer
model.add(Dense(10, activation='softmax', W_regularizer=my_reg))
model.add(Dropout(0.5))

Will the Dropout(0.5) also apply to my_reg during training? If not, how can I make it do so? Thanks in advance!

Upvotes: 1

Views: 80

Answers (1)

Dr. Snoopy

Reputation: 56347

Dropout works by dropping neurons, i.e. setting their activations to zero, so conceptually it also affects the weights associated with those neurons, which you might consider as "applying" to any regularization term. Note, however, that the weights themselves are never explicitly set to zero, so you will see little change in the effect of the regularization term.
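For intuition, here is a minimal sketch (assuming TensorFlow 2.x / tf.keras, with an L2 regularizer standing in for my_reg): dropout zeroes some activations at training time, but the regularization loss is computed from the layer's weight matrix, which dropout never modifies.

import numpy as np
import tensorflow as tf

# Dense layer with a weight regularizer (L2 used here as a stand-in for my_reg)
layer = tf.keras.layers.Dense(10, activation='softmax',
                              kernel_regularizer=tf.keras.regularizers.l2(0.01))
dropout = tf.keras.layers.Dropout(0.5)

x = np.random.rand(4, 64).astype('float32')
y = dropout(layer(x), training=True)   # roughly half the activations are zeroed

print(y.numpy())                       # zeros appear in the activations
print(layer.losses)                    # regularization term, computed from layer.kernel
print(np.count_nonzero(layer.kernel.numpy() == 0))   # the weights contain no zeros

Whether or not dropout is active, layer.losses is the same function of the current weights; dropout only changes the gradients that update those weights.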

Upvotes: 1
