Gadi Licht

Reputation: 21

Can't get PELU or SineReLU activation function to work in Keras with the contrib module

When I try to replace LeakyReLU or relu in working code with either SineReLU or PELU, I keep getting this error:

ValueError: Unknown activation function:PELU

I'm using keras-contrib. I attached example code. I have tried it in several pieces of code. Any method of implementing this would be appreciated.

from keras.layers import Dense, Input, LeakyReLU, UpSampling2D, Conv2D, Concatenate
from keras_contrib.layers import SineReLU
from keras.models import Model, load_model, Sequential
from keras.optimizers import Adam

# Recommended method; requires knowledge of the underlying architecture of the model
from keras_contrib.layers import PELU

import numpy
# fix random seed for reproducibility
numpy.random.seed(7)

# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]

# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='PELU'))
model.add(Dense(8, activation='PELU'))
model.add(Dense(1, activation='sigmoid'))

# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)

# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

# Create your first MLP in Keras
from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
numpy.random.seed(7)
# load pima indians dataset
dataset = numpy.loadtxt("pima-indians-diabetes.csv", delimiter=",")
# split into input (X) and output (Y) variables
X = dataset[:,0:8]
Y = dataset[:,8]
# create model
model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))
# Compile model
model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10)
# evaluate the model
scores = model.evaluate(X, Y)
print("\n%s: %.2f%%" % (model.metrics_names[1], scores[1]*100))

Upvotes: 2

Views: 786

Answers (2)

Ekho

Reputation: 379

Just before answering, I feel a bit amazed to have found a question about the SineReLU on Stack Overflow. I'm the guy who wrote the function. :)

So, custom activations in Keras are called advanced activations, and they extend the Layer class found under keras.layers. After some packaging changes in Keras Contrib, made in preparation for its 1.0 release, SineReLU, along with the other advanced activations, moved to the keras_contrib.layers.advanced_activations package.

But to answer your question: to use SineReLU, or any other advanced activation, do the following:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras_contrib.layers.advanced_activations.sinerelu import SineReLU
...
model = Sequential()
model.add(Dense(128, input_shape=(784,)))
model.add(SineReLU())
model.add(Dropout(0.2))
...

You can also fine-tune the SineReLU; check the documentation for more about its epsilon parameter.
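
For example, a minimal sketch (assuming the constructor exposes epsilon as a keyword argument, with 0.0025 as its default; both details may vary between keras-contrib versions):

from keras_contrib.layers.advanced_activations.sinerelu import SineReLU

# epsilon controls the amplitude of the sine-based oscillation in the
# negative part of SineReLU; the keyword name is an assumption here
model.add(Dense(128))
model.add(SineReLU(epsilon=0.0025))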

I also wrote a Medium story about it and gave a couple of talks at conferences about the function. You can find more resources here:

https://medium.com/@wilder.rodrigues/sinerelu-an-alternative-to-the-relu-activation-function-e46a6199997d

Upvotes: 3

Dr. Snoopy

Reputation: 56347

The problem is that you are not passing the activations correctly: the string format for the activation parameter of a layer only applies to built-in activations, not custom ones.

Additionally, since PELU has trainable parameters, it is implemented as a layer, not as a standalone activation function, so you need to add it like this:

from keras_contrib.layers import PELU

model = Sequential()
model.add(Dense(12, input_dim=8))
model.add(PELU())
model.add(Dense(8))
model.add(PELU())
model.add(Dense(1, activation='sigmoid'))
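
A related pitfall worth noting: if you save this model and later reload it with load_model (which your script already imports), Keras will raise a similar "Unknown layer" error during deserialization unless you pass the custom classes via the custom_objects argument. A minimal sketch, where 'my_model.h5' is a hypothetical file name:

from keras.models import load_model
from keras_contrib.layers import PELU

# Keras cannot deserialize non-built-in layers on its own, so map the
# layer name used in the saved config to the actual class
model = load_model('my_model.h5', custom_objects={'PELU': PELU})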

Upvotes: 1
