AME

Reputation: 71

Custom activation function Keras: Applying different activation to different layers

The input to my custom activation function is going to be a 19 * 19 * 5 tensor, say x. The function needs to apply a sigmoid to the first channel, i.e. x[:,:,0:1], and a ReLU to the remaining channels, i.e. x[:,:,1:5]. I have defined a custom activation function with the following code:

import tensorflow as tf
from keras.layers import Activation
from keras.utils.generic_utils import get_custom_objects

def custom_activation(x):
    # Sigmoid on channel 0, ReLU on channels 1-4, concatenated back together
    return tf.concat([tf.sigmoid(x[:, :, :, 0:1]), tf.nn.relu(x[:, :, :, 1:5])], axis=3)

get_custom_objects().update({'custom_activation': Activation(custom_activation)})

The fourth dimension comes into the picture because the input that custom_activation receives has the batch size as an extra dimension, so the input tensor is of shape [batch_size, 19, 19, 5].
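For reference, the way I intend to use it is roughly like this (the Conv2D below is only a placeholder for whatever layer produces the 19 * 19 * 5 output):

from keras.models import Sequential
from keras.layers import Conv2D, Activation

model = Sequential()
# Placeholder layer producing a (19, 19, 5) output; the real architecture differs
model.add(Conv2D(5, (3, 3), padding='same', input_shape=(19, 19, 8)))
# Apply the custom activation: sigmoid on channel 0, ReLU on channels 1-4
model.add(Activation(custom_activation))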

Could someone tell me if this is the correct way to do it?

Upvotes: 2

Views: 1365

Answers (1)

alta

Reputation: 353

Keras Activations are designed to work on arbitrarily sized layers of almost any imaginable feed-forward architecture (e.g. Tanh, ReLU, Softmax, etc.). The transformation you describe sounds specific to a particular layer in the architecture you are using. As a result, I would recommend accomplishing the task with a Lambda layer:

from keras.layers import Lambda

def custom_activation_shape(input_shape):
    # Ensure the input is a rank-4 tensor
    assert len(input_shape) == 4
    # Ensure the last dimension (channels) has size 5
    assert input_shape[3] == 5

    return input_shape  # Shape is unchanged

This can then be added to your model using

Lambda(custom_activation, output_shape=custom_activation_shape)
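For instance, assuming the (19, 19, 5) tensor comes from a preceding convolutional layer (the Conv2D below is only illustrative), the Lambda slots into a model like this:

from keras.models import Sequential
from keras.layers import Conv2D, Lambda

model = Sequential()
# Illustrative layer producing a (19, 19, 5) output
model.add(Conv2D(5, (3, 3), padding='same', input_shape=(19, 19, 8)))
# Sigmoid on channel 0, ReLU on channels 1-4
model.add(Lambda(custom_activation, output_shape=custom_activation_shape))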

However, if you intend to use this transformation after many different layers in your network, and thus would truly like a custom defined Activation, see How do you create a custom activation function with Keras?, which suggests doing what you wrote in your question.

Upvotes: 1
