Danila Eremenko

Reputation: 35

Creating a custom activation function in Keras

I am trying to create my own custom activation function in Keras, which would return 0 if x < 0 and 1 if x >= 0.

from keras.layers import Dense
from keras.models import Sequential
from keras.layers import Activation
import tensorflow as tf


def hard_lim(x):
    zero = tf.convert_to_tensor(0., x.dtype.base_dtype)
    one = tf.convert_to_tensor(1., x.dtype.base_dtype)
    sess = tf.Session()
    if sess.run(tf.greater_equal(x, zero)):
        return one
    else:
        return zero


model = Sequential()
model.add(Dense(4, input_dim=2, activation=Activation(hard_lim)))
model.add(Dense(2, activation=Activation(hard_lim)))
model.add(Dense(1, activation=Activation(hard_lim)))

It's giving me this error

 InvalidArgumentError (see above for traceback): You must feed a value           
 for placeholder tensor '1_input' with dtype float and shape [?,2]

How can I fix it?

Upvotes: 0

Views: 3003

Answers (1)

Daniel Möller

Reputation: 86600

Warning: the operation you want has no gradient, so no weights before it will be trainable. You will see error messages like "An operation has None for gradient" or something like "None type not supported".

As a workaround for your activation, I believe the 'relu' activation would be the closest and best option, with the advantage of being very popular and used in most models.
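For illustration, a minimal sketch of the question's architecture with 'relu' swapped in (the layer sizes are taken from the question; everything else is a plain default):

from keras.models import Sequential
from keras.layers import Dense

# Same architecture as in the question, but with the built-in 'relu'
# activation, which has usable gradients and so keeps the weights trainable.
model = Sequential()
model.add(Dense(4, input_dim=2, activation='relu'))
model.add(Dense(2, activation='relu'))
model.add(Dense(1, activation='relu'))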

In Keras, you don't usually run sessions. For custom operations, you create a function using backend functions.

So, you'd write it with backend functions (the result can be passed directly as an activation, or wrapped in a Lambda layer):

import keras.backend as K

def hardlim(x):
    # 1.0 where x >= 0, 0.0 otherwise
    return K.cast(K.greater_equal(x, 0), K.floatx())

You can then pass activation=hardlim to your layers.
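For example, a minimal, self-contained sketch (the two-input, single-output shapes are just assumptions for the demo):

import numpy as np
import keras.backend as K
from keras.models import Sequential
from keras.layers import Dense

def hardlim(x):
    # 1.0 where x >= 0, 0.0 otherwise
    return K.cast(K.greater_equal(x, 0), K.floatx())

model = Sequential()
model.add(Dense(4, input_dim=2, activation=hardlim))
model.add(Dense(1, activation=hardlim))

# Forward passes work; training would fail because hardlim
# has no gradient (see the warning at the top of this answer).
print(model.predict(np.zeros((1, 2))))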

Upvotes: 2
