Atirag

Reputation: 1740

Apply own activation function to layer in tensorflow

I'm using a model where the TensorFlow relu function is used as the activation of the hidden layers. So basically the model does this:

h = tf.nn.relu(zw)

where zw is the output of the previous layer multiplied by the weights. According to TensorFlow's definition of relu, it returns

max(zw,0)

i.e. the maximum of 0 and the value of each element of the tensor zw.
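For illustration, a minimal sketch of the standard relu behavior (assuming TensorFlow 2.x eager mode; the values in zw are just example data):

```python
import tensorflow as tf

# Example tensor with positive and negative elements (hypothetical data)
zw = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0])

# Standard relu clamps every negative element to 0
h = tf.nn.relu(zw)
print(h.numpy())  # [0.  0.  0.  1.5 3. ]
```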

How can I apply my own relu function that returns the element of zw if it is above 0, and the element times 0.1 if it is below 0?

Upvotes: 1

Views: 487

Answers (1)

Avishkar Bhoopchand

Reputation: 929

You could do something like this:

h = tf.where(zw < 0, 0.1 * zw, zw)
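As a self-contained sketch of how this behaves (assuming TensorFlow 2.x eager mode; zw is just an example tensor):

```python
import tensorflow as tf

zw = tf.constant([-2.0, -0.5, 0.0, 1.5, 3.0])

# Leaky relu via tf.where: keep positive elements, scale negative ones by 0.1
h = tf.where(zw < 0, 0.1 * zw, zw)
print(h.numpy())  # [-0.2  -0.05  0.    1.5   3.  ]
```

Note that TensorFlow also provides tf.nn.leaky_relu, which implements this behavior with a configurable alpha parameter.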

Upvotes: 1
