solopiu

Reputation: 756

Custom keras activation function for different neurons

I have a custom Keras layer and I have to create my own activation function. Is it possible to apply fixed activations to different neurons in the same layer? For example, say I have something like a Dense layer with 3 units, and I want the activation of the first unit to be a relu, the second a tanh, and the third a sigmoid, independently of the value of x. So this is not what I mean:

def myactivation(x):
    # branches on the VALUE of x -- not what I want
    if some_condition_on_x:
        return relu(x)
    elif some_other_condition:
        return another_activation(x)

What I want instead is to apply a fixed activation to each specific neuron, like this:

def myactivation(x):
    if x == neuron0:
        return relu(x)
    elif x == neuron1:
        return tanh(x)
    else:
        return sigmoid(x)

Is this possible? Or is there another way to implement something like this?

Upvotes: 2

Views: 421

Answers (1)

Daniel Möller

Reputation: 86600

import keras.backend as K

def myactivation(x):
    # x is the layer's output, shaped as (batch_size, units)

    # each element in the last dimension is a neuron
    n0 = x[:, 0:1]
    n1 = x[:, 1:2]
    n2 = x[:, 2:3]  # each slice is shaped as (batch_size, 1)

    # apply a different activation to each neuron
    x0 = K.relu(n0)
    x1 = K.tanh(n1)
    x2 = K.sigmoid(n2)

    # concatenate back to the original (batch_size, units) shape
    return K.concatenate([x0, x1, x2], axis=-1)
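
For completeness, a minimal sketch of plugging this into a model, assuming the standalone Keras package imported above (the input size of 8 is a placeholder chosen only for illustration):

from keras.models import Sequential
from keras.layers import Dense

# the layer must have exactly 3 units, since myactivation
# slices its output into three (batch_size, 1) tensors
model = Sequential([
    Dense(3, activation=myactivation, input_shape=(8,))  # 8 input features: placeholder
])
model.summary()

Since the activation is just a tensor-in, tensor-out function, it can be passed anywhere Keras accepts an activation argument.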

Upvotes: 3
