steve

Reputation: 153

Adding a Dropout layer when using an advanced_activations layer

I have the following NN architecture using Keras:

from keras import Sequential
from keras.layers import Dense, Dropout
from keras.layers.advanced_activations import PReLU

model = Sequential()
model.add(Dense(16, input_dim=32))
model.add(PReLU())

model.add(Dense(8))
model.add(PReLU())

model.add(Dense(4))

model.add(Dense(1, activation='sigmoid'))

I wonder whether it makes any difference to add model.add(Dropout(0.5)) before the PReLU() layer or after it. In other words, where is the correct place to put a Dropout layer when an advanced_activations layer is used?

Thank you.

Upvotes: 0

Views: 796

Answers (1)

Dr. Snoopy

Reputation: 56377

It does not really matter whether you put dropout before or after the activation: for most activations f(0) = 0, so a unit dropped to zero on one side of the activation is still zero on the other side. PReLU in particular is piecewise linear through the origin, so the rescaling of surviving units that (inverted) dropout applies also commutes with it, and both orderings produce the same result.
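This equivalence can be checked numerically. The sketch below (using NumPy rather than Keras, with a hand-rolled `prelu` and an inverted-dropout mask as assumptions standing in for the actual layers) applies the same dropout mask before and after the activation and compares the results:

```python
import numpy as np

rng = np.random.default_rng(0)

def prelu(x, alpha=0.25):
    # PReLU: identity for positive inputs, alpha * x for negative ones; f(0) = 0
    return np.where(x >= 0, x, alpha * x)

x = rng.normal(size=10)

# Inverted dropout with rate 0.5: zero out units, scale survivors by 1/(1 - rate)
mask = (rng.random(10) >= 0.5) / 0.5

# Dropout before the activation vs. after it
before = prelu(mask * x)
after = mask * prelu(x)

print(np.allclose(before, after))  # True: PReLU is piecewise linear through 0
```

Note that this would not hold for an activation with f(0) ≠ 0 (e.g. sigmoid, where f(0) = 0.5), so for such layers the placement of dropout does change the result.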

Upvotes: 2