Reputation: 9170
This is my code, which works if I use other activation layers like tanh:
import keras
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.optimizers import SGD

model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(Activation(act))
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))
sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)
model.fit(X_train, y_train, nb_epoch=20, batch_size=16, show_accuracy=True, validation_split=0.2, verbose=2)
In this case it doesn't work: it fails with "TypeError: 'PReLU' object is not callable", and the error is raised at the model.compile line. Why is this the case? All of the non-advanced activation functions work, but none of the advanced activation functions, including this one, does.
Upvotes: 34
Views: 28623
Reputation: 655
For the Keras functional API, I think the correct way to combine Dense and PReLU (or any other advanced activation) is to use it like this:
# focus_lr and enc_bidi_tns are defined earlier in the model (not shown here)
focus_tns = focus_lr(enc_bidi_tns)

# Dense layer, with a separate PReLU layer applied to its output
enc_dense_lr = k.layers.Dense(units=int(hidden_size))
enc_dense_tns = k.layers.PReLU()(enc_dense_lr(focus_tns))

dropout_lr = k.layers.Dropout(0.2)
dropout_tns = dropout_lr(enc_dense_tns)

enc_dense_lr2 = k.layers.Dense(units=int(hidden_size / 4))
enc_dense_tns2 = k.layers.PReLU()(enc_dense_lr2(dropout_tns))
Of course, you should parameterize the layers according to your problem.
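Here is a minimal, self-contained sketch of the same pattern (the input shape, the hidden_size value, the softmax head, and the compile settings are placeholders of mine, not part of the snippet above):
import keras as k

hidden_size = 64
inp = k.layers.Input(shape=(14,))
# Dense followed by PReLU as its own layer, same pattern as above
dense_tns = k.layers.PReLU()(k.layers.Dense(units=hidden_size)(inp))
dropout_tns = k.layers.Dropout(0.2)(dense_tns)
dense_tns2 = k.layers.PReLU()(k.layers.Dense(units=hidden_size // 4)(dropout_tns))
out = k.layers.Dense(2, activation='softmax')(dense_tns2)

model = k.models.Model(inputs=inp, outputs=out)
model.compile(loss='binary_crossentropy', optimizer='adam')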
Upvotes: 5
Reputation: 1356
If you're using the Model API in Keras, you can pass the activation function directly inside the Keras layer. Here's an example:
from keras.models import Model
from keras.layers import Dense, Input
from keras.layers.advanced_activations import PReLU

# model definition
# encoder
inp = Input(shape=(16,))
lay = Dense(64, kernel_initializer='uniform', activation=PReLU(),
            name='encoder')(inp)
# decoder
out = Dense(2, kernel_initializer='uniform', activation=PReLU(),
            name='decoder')(lay)
# build the model
model = Model(inputs=inp, outputs=out, name='cae')
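A quick usage sketch with random placeholder data (the optimizer, loss, and arrays below are assumptions of mine, just to show that the model compiles and trains):
import numpy as np

model.compile(optimizer='adam', loss='mse')
X = np.random.rand(32, 16)   # matches Input(shape=(16,))
y = np.random.rand(32, 2)    # matches the 2-unit decoder output
model.fit(X, y, epochs=2, batch_size=8)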
Upvotes: 21
Reputation: 19902
The correct way to use advanced activations like PReLU is to add them as layers with the add() method, rather than wrapping them in the Activation class. Example:
model = Sequential()
act = keras.layers.advanced_activations.PReLU(init='zero', weights=None)
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(act)  # the PReLU layer is added directly, not wrapped in Activation
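Applied to the full model from the question, this becomes (a sketch that keeps the question's old-style Keras argument names such as init and nb_epoch):
from keras.models import Sequential
from keras.layers.core import Dense, Dropout, Activation
from keras.layers.advanced_activations import PReLU
from keras.optimizers import SGD

model = Sequential()
model.add(Dense(64, input_dim=14, init='uniform'))
model.add(PReLU(init='zero', weights=None))  # added directly as a layer
model.add(Dropout(0.15))
model.add(Dense(64, init='uniform'))
model.add(Activation('softplus'))
model.add(Dropout(0.15))
model.add(Dense(2, init='uniform'))
model.add(Activation('softmax'))

sgd = SGD(lr=0.1, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='binary_crossentropy', optimizer=sgd)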
Upvotes: 33