pam

Reputation: 13

How to change activations of a layer using a lambda function during training

I am new to Keras and trying to modify the outputs of a layer during training. I want to write a function that takes the layer outputs and returns the modified outputs to the next layer during learning. I have tried using lambda functions but have not really gotten the hang of it.

import numpy as np

def fun(x):
    a = min(x)
    y = np.round(x*(2**a))
    return y

layer_1 = Dense(32, activation='relu')(input)
layer_2 = Dense(12, activation='relu')(layer_1)
lambda_layer = Lambda(fun, output_shape=(12,))(layer_2)
layer_3 = Dense(32, activation='relu')(lambda_layer)

How can I get the layer outputs and modify them before passing them to the next layer?

Upvotes: 1

Views: 124

Answers (2)

BGraf

Reputation: 627

Using a lambda function is the right approach for your problem. However, keep in mind that the lambda function will be part of your computational graph, and during training gradients have to be computed for the whole graph.

For example, you should not use Python's built-in min() function as you did, but rather functions that are part of the Keras backend. Replacing all operations with their Keras backend equivalents results in:

import keras.backend as K

def fun(x):
    a = K.min(x)                    # backend min instead of Python's min()
    y = K.round(x * K.pow(2.0, a))  # elementwise multiply, matching x*(2**a)
    return y

Your final model (and thus all Lambda layers) should only contain native Keras operations so that all calculations can be performed safely during training.
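For illustration, here is a minimal sketch of how the fixed fun could be wired into the question's model (the Input shape of 64 and the mse loss are assumptions, since the question does not show them):

    from keras.layers import Input, Dense, Lambda
    from keras.models import Model

    inputs = Input(shape=(64,))  # assumed input shape
    layer_1 = Dense(32, activation='relu')(inputs)
    layer_2 = Dense(12, activation='relu')(layer_1)
    # fun only uses K.* operations, so it can run inside the graph
    lambda_layer = Lambda(fun, output_shape=(12,))(layer_2)
    layer_3 = Dense(32, activation='relu')(lambda_layer)
    model = Model(inputs=inputs, outputs=layer_3)
    model.compile(optimizer='adam', loss='mse')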

Upvotes: 1

bluesummers

Reputation: 12627

This fails because you are using non-native operations (like np.round) inside a Lambda function, which expects Keras operations.

Examine the keras.backend docs, and take the functions you want to use from there.

So your function should look something like this:

from keras import backend as K

def fun(x):
    a = K.min(x, axis=-1, keepdims=True)  # Specify the axis you need; keepdims lets the result broadcast against x
    y = K.round(x*(2**a))

    return y
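As a quick sanity check (the dummy batch of shape (4, 12) is just an assumption for illustration), you can evaluate the function on a constant tensor outside of any model:

    import numpy as np

    x = K.constant(np.random.rand(4, 12))  # dummy batch: 4 samples, 12 features
    y = K.eval(fun(x))
    print(y.shape)  # (4, 12), same shape as the input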

Upvotes: 0
